Apr 20 13:30:44.300355 ip-10-0-132-232 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 13:30:44.806997 ip-10-0-132-232 kubenswrapper[2563]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 13:30:44.806997 ip-10-0-132-232 kubenswrapper[2563]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 13:30:44.806997 ip-10-0-132-232 kubenswrapper[2563]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 13:30:44.806997 ip-10-0-132-232 kubenswrapper[2563]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 13:30:44.806997 ip-10-0-132-232 kubenswrapper[2563]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 13:30:44.809002 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.808908 2563 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 13:30:44.816412 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816396 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 13:30:44.816412 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816411 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 13:30:44.816412 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816414 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 13:30:44.816507 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816418 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 13:30:44.816507 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816421 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 13:30:44.816507 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816423 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 13:30:44.816507 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816427 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 13:30:44.816507 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816429 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 13:30:44.816507 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816433 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 13:30:44.816507 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816435 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 13:30:44.816507 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816438 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 13:30:44.816507 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816440 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 13:30:44.816507 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816443 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 13:30:44.816507 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816445 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 13:30:44.816507 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816449 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 13:30:44.816507 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816451 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 13:30:44.816507 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816454 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 13:30:44.816507 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816456 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 13:30:44.816507 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816459 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 13:30:44.816507 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816462 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 13:30:44.816507 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816465 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 13:30:44.816507 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816467 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 13:30:44.816507 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816470 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 13:30:44.816992 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816472 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 13:30:44.816992 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816475 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 13:30:44.816992 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816477 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 13:30:44.816992 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816481 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
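The five deprecation warnings at the top of this excerpt share one remedy: move the flags into the KubeletConfiguration file named by --config (on this node, /etc/kubernetes/kubelet.conf per the FLAG dump below). A minimal, hypothetical sketch of the equivalent stanzas follows; the field names are from the kubelet.config.k8s.io/v1beta1 API, the non-eviction values are taken from the FLAG dump in this log, the evictionHard threshold is an invented example, and on an OpenShift node this file is rendered by the Machine Config Operator rather than edited by hand:

    # Sketch of a KubeletConfiguration carrying the deprecated flags' values.
    # All values except evictionHard come from the FLAG dump in this log;
    # the evictionHard threshold is a hypothetical example.
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerRuntimeEndpoint: "/var/run/crio/crio.sock"             # was --container-runtime-endpoint
    volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"  # was --volume-plugin-dir
    systemReserved:                                                 # was --system-reserved
      cpu: "500m"
      ephemeral-storage: "1Gi"
      memory: "1Gi"
    evictionHard:                    # the replacement the --minimum-container-ttl-duration warning points at
      memory.available: "100Mi"      # hypothetical threshold, not from this log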
Apr 20 13:30:44.816992 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816485 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 13:30:44.816992 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816488 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 13:30:44.816992 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816491 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 13:30:44.816992 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816494 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 13:30:44.816992 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816497 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 13:30:44.816992 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816499 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 13:30:44.816992 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816502 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 13:30:44.816992 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816504 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 13:30:44.816992 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816508 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 13:30:44.816992 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816512 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 13:30:44.816992 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816516 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 13:30:44.816992 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816519 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 13:30:44.816992 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816522 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 13:30:44.816992 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816524 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 13:30:44.816992 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816527 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 13:30:44.816992 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816531 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 13:30:44.817589 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816533 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 13:30:44.817589 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816536 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 13:30:44.817589 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816539 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 13:30:44.817589 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816541 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 13:30:44.817589 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816544 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 13:30:44.817589 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816547 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 13:30:44.817589 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816551 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 13:30:44.817589 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816554 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 13:30:44.817589 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816556 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 13:30:44.817589 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816559 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 13:30:44.817589 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816561 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 13:30:44.817589 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816564 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 13:30:44.817589 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816566 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 13:30:44.817589 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816569 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 13:30:44.817589 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816572 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 13:30:44.817589 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816574 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 13:30:44.817589 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816577 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 13:30:44.817589 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816579 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 13:30:44.817589 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816582 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 13:30:44.818087 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816585 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 13:30:44.818087 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816588 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 13:30:44.818087 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816590 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 13:30:44.818087 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816592 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 13:30:44.818087 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816595 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 13:30:44.818087 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816598 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 13:30:44.818087 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816600 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 13:30:44.818087 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816603 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 13:30:44.818087 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816605 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 13:30:44.818087 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816608 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 13:30:44.818087 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816610 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 13:30:44.818087 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816613 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 13:30:44.818087 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816615 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 13:30:44.818087 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816618 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 13:30:44.818087 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816620 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 13:30:44.818087 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816623 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 13:30:44.818087 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816625 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 13:30:44.818087 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816628 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 13:30:44.818087 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816630 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 13:30:44.818087 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816633 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 13:30:44.818564 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816635 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 13:30:44.818564 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816638 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 13:30:44.818564 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816640 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 13:30:44.818564 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.816642 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 13:30:44.818564 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817027 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 13:30:44.818564 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817032 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 13:30:44.818564 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817035 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 13:30:44.818564 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817038 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 13:30:44.818564 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817042 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 13:30:44.818564 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817062 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 13:30:44.818564 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817066 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 13:30:44.818564 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817069 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 13:30:44.818564 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817072 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 13:30:44.818564 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817075 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 13:30:44.818564 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817078 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 13:30:44.818564 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817081 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 13:30:44.818564 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817085 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 13:30:44.818564 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817087 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 13:30:44.818564 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817090 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 13:30:44.819038 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817093 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 13:30:44.819038 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817095 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 13:30:44.819038 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817098 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 13:30:44.819038 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817101 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 13:30:44.819038 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817103 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 13:30:44.819038 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817106 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 13:30:44.819038 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817109 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 13:30:44.819038 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817111 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 13:30:44.819038 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817113 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 13:30:44.819038 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817116 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 13:30:44.819038 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817118 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 13:30:44.819038 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817120 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 13:30:44.819038 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817123 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 13:30:44.819038 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817126 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 13:30:44.819038 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817129 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 13:30:44.819038 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817132 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 13:30:44.819038 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817134 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 13:30:44.819038 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817137 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 13:30:44.819038 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817139 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 13:30:44.819567 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817143 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 13:30:44.819567 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817145 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 13:30:44.819567 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817148 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 13:30:44.819567 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817150 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 13:30:44.819567 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817153 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 13:30:44.819567 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817155 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 13:30:44.819567 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817158 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 13:30:44.819567 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817161 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 13:30:44.819567 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817163 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 13:30:44.819567 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817166 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 13:30:44.819567 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817168 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 13:30:44.819567 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817171 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 13:30:44.819567 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817175 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 13:30:44.819567 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817177 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 13:30:44.819567 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817180 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 13:30:44.819567 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817182 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 13:30:44.819567 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817184 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 13:30:44.819567 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817187 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 13:30:44.819567 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817189 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 13:30:44.819567 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817192 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 13:30:44.820117 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817194 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 13:30:44.820117 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817197 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 13:30:44.820117 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817199 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 13:30:44.820117 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817202 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 13:30:44.820117 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817204 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 13:30:44.820117 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817206 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 13:30:44.820117 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817209 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 13:30:44.820117 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817211 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 13:30:44.820117 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817214 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 13:30:44.820117 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817216 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 13:30:44.820117 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817218 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 13:30:44.820117 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817221 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 13:30:44.820117 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817224 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 13:30:44.820117 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817227 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 13:30:44.820117 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817229 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 13:30:44.820117 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817232 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 13:30:44.820117 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817234 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 13:30:44.820117 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817237 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 13:30:44.820117 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817240 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 13:30:44.820117 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817242 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 13:30:44.820640 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817244 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 13:30:44.820640 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817247 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 13:30:44.820640 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817249 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 13:30:44.820640 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817252 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 13:30:44.820640 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817257 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 13:30:44.820640 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817261 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 13:30:44.820640 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817263 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 13:30:44.820640 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817266 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 13:30:44.820640 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817268 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 13:30:44.820640 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817270 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 13:30:44.820640 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817273 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 13:30:44.820640 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817276 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 13:30:44.820640 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817348 2563 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 13:30:44.820640 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817354 2563 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 13:30:44.820640 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817361 2563 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 13:30:44.820640 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817366 2563 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 13:30:44.820640 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817370 2563 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 13:30:44.820640 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817373 2563 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 13:30:44.820640 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817377 2563 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 13:30:44.820640 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817381 2563 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 13:30:44.820640 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817385 2563 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817388 2563 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817391 2563 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817395 2563 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817398 2563 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817401 2563 flags.go:64] FLAG: --cgroup-root=""
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817404 2563 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817407 2563 flags.go:64] FLAG: --client-ca-file=""
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817410 2563 flags.go:64] FLAG: --cloud-config=""
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817412 2563 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817415 2563 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817420 2563 flags.go:64] FLAG: --cluster-domain=""
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817423 2563 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817426 2563 flags.go:64] FLAG: --config-dir=""
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817429 2563 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817432 2563 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817436 2563 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817440 2563 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817443 2563 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817447 2563 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817450 2563 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817453 2563 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817456 2563 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817459 2563 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817462 2563 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 13:30:44.821166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817466 2563 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817469 2563 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817472 2563 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817475 2563 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817479 2563 flags.go:64] FLAG: --enable-server="true"
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817481 2563 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817486 2563 flags.go:64] FLAG: --event-burst="100"
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817489 2563 flags.go:64] FLAG: --event-qps="50"
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817492 2563 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817495 2563 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817497 2563 flags.go:64] FLAG: --eviction-hard=""
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817501 2563 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817504 2563 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817507 2563 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817510 2563 flags.go:64] FLAG: --eviction-soft=""
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817513 2563 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817516 2563 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817519 2563 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817521 2563 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817524 2563 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817527 2563 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817529 2563 flags.go:64] FLAG: --feature-gates=""
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817533 2563 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817537 2563 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817540 2563 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 13:30:44.821801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817544 2563 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817547 2563 flags.go:64] FLAG: --healthz-port="10248"
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817550 2563 flags.go:64] FLAG: --help="false"
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817553 2563 flags.go:64] FLAG: --hostname-override="ip-10-0-132-232.ec2.internal"
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817556 2563 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817559 2563 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817561 2563 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817564 2563 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817568 2563 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817571 2563 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817573 2563 flags.go:64] FLAG: --image-service-endpoint=""
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817577 2563 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817579 2563 flags.go:64] FLAG: --kube-api-burst="100"
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817582 2563 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817585 2563 flags.go:64] FLAG: --kube-api-qps="50"
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817588 2563 flags.go:64] FLAG: --kube-reserved=""
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817592 2563 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817595 2563 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817598 2563 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817601 2563 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817603 2563 flags.go:64] FLAG: --lock-file=""
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817606 2563 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817609 2563 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817612 2563 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 20 13:30:44.822445 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817617 2563 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817620 2563 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817622 2563 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817625 2563 flags.go:64] FLAG: --logging-format="text"
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817628 2563 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817632 2563 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817635 2563 flags.go:64] FLAG: --manifest-url=""
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817638 2563 flags.go:64] FLAG: --manifest-url-header=""
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817643 2563 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817647 2563 flags.go:64] FLAG: --max-open-files="1000000"
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817651 2563 flags.go:64] FLAG: --max-pods="110"
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817654 2563 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817657 2563 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817660 2563 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817662 2563 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817665 2563 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817668 2563 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817671 2563 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817679 2563 flags.go:64] FLAG: --node-status-max-images="50"
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817682 2563 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817685 2563 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817688 2563 flags.go:64] FLAG: --pod-cidr=""
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817690 2563 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 20 13:30:44.823035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817696 2563 flags.go:64] FLAG: --pod-manifest-path=""
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817699 2563 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817702 2563 flags.go:64] FLAG: --pods-per-core="0"
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817705 2563 flags.go:64] FLAG: --port="10250"
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817708 2563 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817711 2563 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-048ba71ab60e7df80"
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817714 2563 flags.go:64] FLAG: --qos-reserved=""
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817717 2563 flags.go:64] FLAG: --read-only-port="10255"
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817720 2563 flags.go:64] FLAG: --register-node="true"
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817723 2563 flags.go:64] FLAG: --register-schedulable="true"
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817725 2563 flags.go:64] FLAG: --register-with-taints=""
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817729 2563 flags.go:64] FLAG: --registry-burst="10"
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817732 2563 flags.go:64] FLAG: --registry-qps="5"
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817735 2563 flags.go:64] FLAG: --reserved-cpus=""
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817740 2563 flags.go:64] FLAG: --reserved-memory=""
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817744 2563 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817747 2563 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817750 2563 flags.go:64] FLAG: --rotate-certificates="false"
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817754 2563 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817757 2563 flags.go:64] FLAG: --runonce="false"
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817760 2563 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817763 2563 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817766 2563 flags.go:64] FLAG: --seccomp-default="false"
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817768 2563 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817771 2563 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817774 2563 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 20 13:30:44.823619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817777 2563 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817780 2563 flags.go:64] FLAG: --storage-driver-password="root"
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817783 2563 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817786 2563 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817788 2563 flags.go:64] FLAG: --storage-driver-user="root"
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817791 2563 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817794 2563 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817797 2563 flags.go:64] FLAG: --system-cgroups=""
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817799 2563 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817804 2563 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817807 2563 flags.go:64] FLAG: --tls-cert-file=""
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817810 2563 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817814 2563 flags.go:64] FLAG: --tls-min-version=""
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817817 2563 flags.go:64] FLAG: --tls-private-key-file=""
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817819 2563 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817822 2563 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817825 2563 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817827 2563 flags.go:64] FLAG: --v="2"
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817831 2563 flags.go:64] FLAG: --version="false"
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817836 2563 flags.go:64] FLAG: --vmodule=""
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817846 2563 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.817849 2563 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817941 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817945 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 13:30:44.824262 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817948 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 13:30:44.824855 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817951 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 13:30:44.824855 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817955 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 13:30:44.824855 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817957 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 13:30:44.824855 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817960 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 13:30:44.824855 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817962 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 13:30:44.824855 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817967 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 13:30:44.824855 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817970 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 13:30:44.824855 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817974 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 13:30:44.824855 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817976 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 13:30:44.824855 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817979 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 13:30:44.824855 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817982 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 13:30:44.824855 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817985 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 13:30:44.824855 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817987 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 13:30:44.824855 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817989 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 13:30:44.824855 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817992 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 13:30:44.824855 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817994 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 13:30:44.824855 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817997 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 13:30:44.824855 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.817999 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 13:30:44.824855 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818002 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 13:30:44.824855 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818004 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 13:30:44.825437 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818006 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 13:30:44.825437 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818009 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 13:30:44.825437 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818011 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 13:30:44.825437 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818014 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 13:30:44.825437 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818017 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 13:30:44.825437 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818020 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 13:30:44.825437 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818022 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 13:30:44.825437 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818026 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 13:30:44.825437 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818028 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 13:30:44.825437 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818031 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 13:30:44.825437 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818034 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 13:30:44.825437 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818036 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 13:30:44.825437 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818039 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 13:30:44.825437 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818041 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 13:30:44.825437 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818059 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 13:30:44.825437 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818062 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 13:30:44.825437 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818065 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 13:30:44.825437 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818068 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 13:30:44.825437 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818070 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 13:30:44.825900 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818073 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 13:30:44.825900 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818075 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 13:30:44.825900 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818078 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 13:30:44.825900 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818080 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 13:30:44.825900 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818082 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 13:30:44.825900 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818085 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 13:30:44.825900 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818087 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 13:30:44.825900 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818090 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 13:30:44.825900 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818093 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 13:30:44.825900 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818095 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 13:30:44.825900 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818098 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 13:30:44.825900 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818100 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 13:30:44.825900 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818103 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 13:30:44.825900 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818105 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 13:30:44.825900 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818108 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 13:30:44.825900 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818110 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 13:30:44.825900 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818112 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 13:30:44.825900 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818115 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 13:30:44.825900 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818117 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 13:30:44.825900 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818120 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 13:30:44.826402 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818123 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 13:30:44.826402 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818126 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 13:30:44.826402 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818129 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 13:30:44.826402 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818132 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 13:30:44.826402 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818135 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 13:30:44.826402 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818138 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 13:30:44.826402 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818140 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 13:30:44.826402 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818142 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 13:30:44.826402 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818145 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 13:30:44.826402 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818148 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 13:30:44.826402 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818152 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 13:30:44.826402 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818155 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 13:30:44.826402 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818158 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 13:30:44.826402 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818161 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 13:30:44.826402 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818163 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 13:30:44.826402 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818165 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 13:30:44.826402 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818168 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 13:30:44.826402 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818170 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 13:30:44.826402 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818173 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 13:30:44.826402 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818175 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 13:30:44.826894 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818177 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 13:30:44.826894 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818180 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 13:30:44.826894 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818182 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 13:30:44.826894 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.818185 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 13:30:44.826894 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.819072 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 13:30:44.826894 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.826655 2563 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 13:30:44.826894 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.826672 2563 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 13:30:44.826894 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826728 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 13:30:44.826894 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826733 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 13:30:44.826894 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826737 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 13:30:44.826894 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826740 2563 feature_gate.go:328] unrecognized feature gate:
Example Apr 20 13:30:44.826894 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826742 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 13:30:44.826894 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826745 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 13:30:44.826894 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826749 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 13:30:44.826894 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826751 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 13:30:44.826894 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826755 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 13:30:44.827330 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826757 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 13:30:44.827330 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826760 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 13:30:44.827330 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826763 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 13:30:44.827330 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826765 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 13:30:44.827330 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826768 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 13:30:44.827330 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826770 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 13:30:44.827330 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826773 2563 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 13:30:44.827330 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826776 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 13:30:44.827330 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826778 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 13:30:44.827330 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826780 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 13:30:44.827330 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826783 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 13:30:44.827330 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826786 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 13:30:44.827330 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826790 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 13:30:44.827330 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826793 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 13:30:44.827330 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826795 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 13:30:44.827330 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826798 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 13:30:44.827330 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826800 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 13:30:44.827330 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826803 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 13:30:44.827330 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826807 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 13:30:44.827799 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826811 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 13:30:44.827799 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826814 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 13:30:44.827799 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826817 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 13:30:44.827799 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826826 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 13:30:44.827799 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826829 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 13:30:44.827799 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826831 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 13:30:44.827799 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826834 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 13:30:44.827799 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826836 2563 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 13:30:44.827799 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826839 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 13:30:44.827799 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826841 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 13:30:44.827799 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826844 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 13:30:44.827799 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826846 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 13:30:44.827799 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826848 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 13:30:44.827799 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826851 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 13:30:44.827799 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826853 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 13:30:44.827799 
ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826856 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 13:30:44.827799 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826858 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 13:30:44.827799 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826860 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 13:30:44.827799 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826862 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 13:30:44.827799 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826865 2563 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 13:30:44.828302 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826867 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 13:30:44.828302 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826870 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 13:30:44.828302 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826872 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 13:30:44.828302 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826874 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 13:30:44.828302 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826876 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 13:30:44.828302 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826879 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 13:30:44.828302 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826881 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 13:30:44.828302 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826884 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 13:30:44.828302 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826886 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 13:30:44.828302 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826888 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 13:30:44.828302 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826890 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 13:30:44.828302 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826893 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 13:30:44.828302 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826895 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 13:30:44.828302 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826898 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 13:30:44.828302 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826901 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 13:30:44.828302 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826904 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 13:30:44.828302 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826920 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 13:30:44.828302 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826924 
2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 13:30:44.828302 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826927 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 13:30:44.828302 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826929 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 13:30:44.828792 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826932 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 13:30:44.828792 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826934 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 13:30:44.828792 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826937 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 13:30:44.828792 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826939 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 13:30:44.828792 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826942 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 13:30:44.828792 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826944 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 13:30:44.828792 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826947 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 13:30:44.828792 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826949 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 13:30:44.828792 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826952 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 13:30:44.828792 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826954 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 13:30:44.828792 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826956 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 13:30:44.828792 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826959 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 13:30:44.828792 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826961 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 13:30:44.828792 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826963 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 13:30:44.828792 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826966 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 13:30:44.828792 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826969 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 13:30:44.828792 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826971 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 13:30:44.828792 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.826974 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 13:30:44.829299 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.826979 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true 
RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 13:30:44.829299 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827131 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 13:30:44.829299 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827136 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 13:30:44.829299 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827139 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 13:30:44.829299 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827141 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 13:30:44.829299 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827144 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 13:30:44.829299 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827146 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 13:30:44.829299 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827149 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 13:30:44.829299 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827151 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 13:30:44.829299 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827154 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 13:30:44.829299 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827157 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 13:30:44.829299 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827164 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 13:30:44.829299 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827167 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 13:30:44.829299 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827169 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 13:30:44.829299 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827172 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 13:30:44.829665 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827174 2563 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 13:30:44.829665 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827177 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 13:30:44.829665 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827180 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 13:30:44.829665 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827183 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 13:30:44.829665 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827186 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 13:30:44.829665 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827189 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 13:30:44.829665 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827192 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 13:30:44.829665 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827195 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 13:30:44.829665 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827197 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 13:30:44.829665 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827200 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 13:30:44.829665 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827203 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 13:30:44.829665 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827205 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 13:30:44.829665 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827208 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 13:30:44.829665 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827210 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 13:30:44.829665 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827212 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 13:30:44.829665 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827215 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 13:30:44.829665 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827217 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 13:30:44.829665 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827220 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 13:30:44.829665 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827222 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 13:30:44.829665 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827224 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 13:30:44.830183 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827227 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 13:30:44.830183 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827231 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 13:30:44.830183 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827234 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 13:30:44.830183 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827237 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 13:30:44.830183 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827239 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 13:30:44.830183 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827242 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 13:30:44.830183 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827245 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 13:30:44.830183 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827248 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 13:30:44.830183 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827250 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 13:30:44.830183 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827258 2563 feature_gate.go:328] unrecognized feature gate: Example Apr 20 13:30:44.830183 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827261 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 13:30:44.830183 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827263 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 13:30:44.830183 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827265 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 13:30:44.830183 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827268 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 13:30:44.830183 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827270 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 13:30:44.830183 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827272 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 13:30:44.830183 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827275 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 13:30:44.830183 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827277 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 13:30:44.830183 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827280 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 13:30:44.830183 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827283 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 13:30:44.830666 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827285 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 13:30:44.830666 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827288 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 13:30:44.830666 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827290 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 13:30:44.830666 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827292 2563 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 13:30:44.830666 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827294 2563 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 13:30:44.830666 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827297 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 13:30:44.830666 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827299 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 13:30:44.830666 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827302 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 13:30:44.830666 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827304 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 13:30:44.830666 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827306 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 13:30:44.830666 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827309 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 13:30:44.830666 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827311 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 13:30:44.830666 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827314 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 13:30:44.830666 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827316 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 13:30:44.830666 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827318 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 13:30:44.830666 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827320 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 13:30:44.830666 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827323 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 13:30:44.830666 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827325 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 13:30:44.830666 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827328 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 13:30:44.830666 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827330 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 13:30:44.831232 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827332 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 13:30:44.831232 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827335 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 13:30:44.831232 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827345 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 13:30:44.831232 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827347 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 13:30:44.831232 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827355 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 13:30:44.831232 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827357 2563 feature_gate.go:328] unrecognized feature gate: 
InsightsOnDemandDataGather Apr 20 13:30:44.831232 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827360 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 13:30:44.831232 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827362 2563 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 13:30:44.831232 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827365 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 13:30:44.831232 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827367 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 13:30:44.831232 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827369 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 13:30:44.831232 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:44.827372 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 13:30:44.831232 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.827377 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 13:30:44.831232 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.828180 2563 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 13:30:44.831232 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.830552 2563 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 13:30:44.831657 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.831644 2563 server.go:1019] "Starting client certificate rotation" Apr 20 13:30:44.831768 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.831750 2563 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 13:30:44.831801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.831793 2563 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 13:30:44.864379 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.864353 2563 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 13:30:44.869129 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.869095 2563 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 13:30:44.884376 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.884348 2563 log.go:25] "Validated CRI v1 runtime API" Apr 20 13:30:44.892397 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.892376 2563 log.go:25] "Validated CRI v1 image API" Apr 20 13:30:44.895387 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.895361 2563 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 20 13:30:44.895501 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.895430 2563 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" 
reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 13:30:44.900062 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.900021 2563 fs.go:135] Filesystem UUIDs: map[3abedcdf-9643-4a39-a2ff-3dd34b941da4:/dev/nvme0n1p4 72a8dbab-9048-436d-b8fa-56395a2c07a1:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 20 13:30:44.900150 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.900067 2563 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 20 13:30:44.906274 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.906166 2563 manager.go:217] Machine: {Timestamp:2026-04-20 13:30:44.903923565 +0000 UTC m=+0.467614826 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3104529 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec270439fa6bba98599811b9c4ddf6c9 SystemUUID:ec270439-fa6b-ba98-5998-11b9c4ddf6c9 BootID:31f8d222-4be3-448a-aff5-fc022ab3a0e2 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:5d:c9:0f:b6:65 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:5d:c9:0f:b6:65 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:46:52:8d:e8:e1:16 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 20 13:30:44.906274 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.906269 2563 
manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 20 13:30:44.906383 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.906352 2563 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 20 13:30:44.907520 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.907493 2563 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 20 13:30:44.907660 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.907522 2563 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-232.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 13:30:44.907707 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.907669 2563 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 13:30:44.907707 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.907677 2563 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 13:30:44.907707 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.907689 2563 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 13:30:44.908620 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.908609 2563 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 13:30:44.910691 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.910679 2563 state_mem.go:36] "Initialized new in-memory state store" Apr 20 13:30:44.910805 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.910796 2563 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 13:30:44.913463 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.913451 2563 kubelet.go:491] "Attempting to sync node with API server" Apr 20 
13:30:44.913503 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.913471 2563 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 13:30:44.913503 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.913483 2563 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 13:30:44.913503 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.913492 2563 kubelet.go:397] "Adding apiserver pod source" Apr 20 13:30:44.913613 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.913507 2563 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 13:30:44.914749 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.914731 2563 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 13:30:44.914818 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.914760 2563 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 13:30:44.919026 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.918963 2563 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 13:30:44.921386 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.921370 2563 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 13:30:44.923142 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.923130 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 13:30:44.923188 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.923148 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 13:30:44.923188 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.923157 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 13:30:44.923188 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.923165 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 13:30:44.923188 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.923172 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 13:30:44.923188 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.923178 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 13:30:44.923188 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.923184 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 13:30:44.923188 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.923190 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 13:30:44.923370 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.923198 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 13:30:44.923370 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.923204 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 13:30:44.923370 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.923212 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 13:30:44.923370 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.923223 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 13:30:44.924259 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.924249 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 
13:30:44.924293 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.924261 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 13:30:44.926438 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:44.926408 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 13:30:44.926519 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:44.926480 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-232.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 13:30:44.926785 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.926771 2563 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-232.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 13:30:44.928510 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.928496 2563 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 13:30:44.928584 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.928532 2563 server.go:1295] "Started kubelet" Apr 20 13:30:44.928647 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.928621 2563 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 13:30:44.928745 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.928686 2563 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 13:30:44.928826 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.928772 2563 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 13:30:44.929351 ip-10-0-132-232 systemd[1]: Started Kubernetes Kubelet. Apr 20 13:30:44.930248 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.930234 2563 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 13:30:44.931884 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.931859 2563 server.go:317] "Adding debug handlers to kubelet server" Apr 20 13:30:44.938635 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.938614 2563 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 13:30:44.938923 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:44.938904 2563 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 13:30:44.939294 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.939273 2563 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 13:30:44.940090 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.940069 2563 factory.go:55] Registering systemd factory
Apr 20 13:30:44.940197 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.940102 2563 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 13:30:44.940197 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.940121 2563 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 13:30:44.940197 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.940133 2563 factory.go:223] Registration of the systemd container factory successfully
Apr 20 13:30:44.940357 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.940205 2563 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 13:30:44.940357 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.940269 2563 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 13:30:44.940357 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.940277 2563 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 13:30:44.940357 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:44.940317 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-232.ec2.internal\" not found"
Apr 20 13:30:44.940484 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.940402 2563 factory.go:153] Registering CRI-O factory
Apr 20 13:30:44.940484 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.940415 2563 factory.go:223] Registration of the crio container factory successfully
Apr 20 13:30:44.940484 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.940457 2563 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 13:30:44.940484 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.940479 2563 factory.go:103] Registering Raw factory
Apr 20 13:30:44.940677 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.940489 2563 manager.go:1196] Started watching for new ooms in manager
Apr 20 13:30:44.940954 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.940934 2563 manager.go:319] Starting recovery of all containers
Apr 20 13:30:44.942305 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:44.942275 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 20 13:30:44.942421 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:44.942343 2563 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-132-232.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 20 13:30:44.944224 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:44.942443 2563 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-232.ec2.internal.18a813d1d44c13ec default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-232.ec2.internal,UID:ip-10-0-132-232.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-232.ec2.internal,},FirstTimestamp:2026-04-20 13:30:44.928508908 +0000 UTC m=+0.492200169,LastTimestamp:2026-04-20 13:30:44.928508908 +0000 UTC m=+0.492200169,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-232.ec2.internal,}"
Apr 20 13:30:44.952979 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.952795 2563 manager.go:324] Recovery completed
Apr 20 13:30:44.957475 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.957457 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 13:30:44.959897 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.959881 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-232.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 13:30:44.959974 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.959911 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-232.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 13:30:44.959974 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.959921 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-232.ec2.internal" event="NodeHasSufficientPID"
Apr 20 13:30:44.960506 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.960490 2563 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 13:30:44.960562 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.960506 2563 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 13:30:44.960562 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.960525 2563 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 13:30:44.962664 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:44.962592 2563 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-232.ec2.internal.18a813d1d62b08c7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-232.ec2.internal,UID:ip-10-0-132-232.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-132-232.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-132-232.ec2.internal,},FirstTimestamp:2026-04-20 13:30:44.959897799 +0000 UTC m=+0.523589063,LastTimestamp:2026-04-20 13:30:44.959897799 +0000 UTC m=+0.523589063,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-232.ec2.internal,}"
Apr 20 13:30:44.963240 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.963226 2563 policy_none.go:49] "None policy: Start"
Apr 20 13:30:44.963284 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.963248 2563 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 13:30:44.963284 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:44.963270 2563 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 13:30:44.971081 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:44.970981 2563 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-232.ec2.internal.18a813d1d62b4e8a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-232.ec2.internal,UID:ip-10-0-132-232.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-132-232.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-132-232.ec2.internal,},FirstTimestamp:2026-04-20 13:30:44.959915658 +0000 UTC m=+0.523606921,LastTimestamp:2026-04-20 13:30:44.959915658 +0000 UTC m=+0.523606921,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-232.ec2.internal,}"
Apr 20 13:30:44.978117 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:44.978029 2563 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-232.ec2.internal.18a813d1d62b749e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-232.ec2.internal,UID:ip-10-0-132-232.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-132-232.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-132-232.ec2.internal,},FirstTimestamp:2026-04-20 13:30:44.959925406 +0000 UTC m=+0.523616669,LastTimestamp:2026-04-20 13:30:44.959925406 +0000 UTC m=+0.523616669,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-232.ec2.internal,}"
Apr 20 13:30:45.009698 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.009676 2563 manager.go:341] "Starting Device Plugin manager"
Apr 20 13:30:45.021412 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:45.009719 2563 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 13:30:45.021412 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.009732 2563 server.go:85] "Starting device plugin registration server"
Apr 20 13:30:45.021412 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.010030 2563 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 13:30:45.021412 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.010064 2563 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 13:30:45.021412 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.010166 2563 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 13:30:45.021412 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.010247 2563 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 13:30:45.021412 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.010254 2563 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
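[Annotation] The repeated `system:anonymous` "forbidden" errors above show the kubelet talking to the API server before its client certificate has been issued; they stop once the bootstrap CSR (csr-sx87l, a few entries below) is approved and issued. A minimal client-go sketch for inspecting that hand-off from outside the node, assuming only a reachable kubeconfig at the default path; nothing in it is taken from this cluster:

// csr_list.go - list CertificateSigningRequests and their approval conditions.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	csrs, err := cs.CertificatesV1().CertificateSigningRequests().List(context.Background(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	for _, csr := range csrs.Items {
		// For a node-client CSR such as csr-sx87l, a condition flips to Approved and then
		// the signed certificate lands in status.certificate ("issued" in the log).
		fmt.Printf("%s requestor=%s conditions=%v\n", csr.Name, csr.Spec.Username, csr.Status.Conditions)
	}
}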
Apr 20 13:30:45.021412 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:45.010990 2563 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 13:30:45.021412 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:45.011028 2563 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-232.ec2.internal\" not found"
Apr 20 13:30:45.021809 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.021791 2563 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sx87l"
Apr 20 13:30:45.029408 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.029389 2563 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sx87l"
Apr 20 13:30:45.059715 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.059649 2563 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 13:30:45.061019 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.061002 2563 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 13:30:45.061093 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.061031 2563 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 13:30:45.061093 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.061072 2563 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 13:30:45.061093 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.061082 2563 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 13:30:45.061236 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:45.061128 2563 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 13:30:45.065916 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.065892 2563 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 13:30:45.110847 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.110815 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 13:30:45.113095 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.113079 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-232.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 13:30:45.113164 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.113111 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-232.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 13:30:45.113164 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.113124 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-232.ec2.internal" event="NodeHasSufficientPID"
Apr 20 13:30:45.113164 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.113148 2563 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-232.ec2.internal"
Apr 20 13:30:45.121237 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.121211 2563 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-232.ec2.internal"
Apr 20 13:30:45.121237 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:45.121238 2563 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-232.ec2.internal\": node \"ip-10-0-132-232.ec2.internal\" not found"
Apr 20 13:30:45.132977 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:45.132951 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-232.ec2.internal\" not found"
Apr 20 13:30:45.161243 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.161208 2563 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-232.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-232.ec2.internal"]
Apr 20 13:30:45.161381 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.161288 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 13:30:45.162290 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.162272 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-232.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 13:30:45.162348 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.162308 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-232.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 13:30:45.162348 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.162321 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-232.ec2.internal" event="NodeHasSufficientPID"
Apr 20 13:30:45.163634 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.163620 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 13:30:45.163816 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.163801 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-232.ec2.internal"
Apr 20 13:30:45.163865 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.163832 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 13:30:45.164466 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.164448 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-232.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 13:30:45.164466 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.164457 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-232.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 13:30:45.164581 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.164480 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-232.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 13:30:45.164581 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.164490 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-232.ec2.internal" event="NodeHasSufficientPID"
Apr 20 13:30:45.164581 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.164481 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-232.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 13:30:45.164581 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.164563 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-232.ec2.internal" event="NodeHasSufficientPID"
Apr 20 13:30:45.165602 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.165588 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-232.ec2.internal"
Apr 20 13:30:45.165655 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.165625 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 13:30:45.166360 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.166344 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-232.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 13:30:45.166456 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.166369 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-232.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 13:30:45.166456 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.166380 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-232.ec2.internal" event="NodeHasSufficientPID"
Apr 20 13:30:45.198878 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:45.198829 2563 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-232.ec2.internal\" not found" node="ip-10-0-132-232.ec2.internal"
Apr 20 13:30:45.203403 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:45.203384 2563 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-232.ec2.internal\" not found" node="ip-10-0-132-232.ec2.internal"
Apr 20 13:30:45.233858 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:45.233825 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-232.ec2.internal\" not found"
Apr 20 13:30:45.242258 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.242232 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/410d242d04c4d94f2a1060285211a1a3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-232.ec2.internal\" (UID: \"410d242d04c4d94f2a1060285211a1a3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-232.ec2.internal"
Apr 20 13:30:45.242354 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.242259 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/410d242d04c4d94f2a1060285211a1a3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-232.ec2.internal\" (UID: \"410d242d04c4d94f2a1060285211a1a3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-232.ec2.internal"
Apr 20 13:30:45.242354 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.242288 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0ea90522398b66e089408acd5ec34cb0-config\") pod \"kube-apiserver-proxy-ip-10-0-132-232.ec2.internal\" (UID: \"0ea90522398b66e089408acd5ec34cb0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-232.ec2.internal"
Apr 20 13:30:45.334666 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:45.334577 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-232.ec2.internal\" not found"
Apr 20 13:30:45.343058 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.343029 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/410d242d04c4d94f2a1060285211a1a3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-232.ec2.internal\" (UID: \"410d242d04c4d94f2a1060285211a1a3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-232.ec2.internal"
Apr 20 13:30:45.343167 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.343081 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/410d242d04c4d94f2a1060285211a1a3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-232.ec2.internal\" (UID: \"410d242d04c4d94f2a1060285211a1a3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-232.ec2.internal"
Apr 20 13:30:45.343167 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.343142 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/410d242d04c4d94f2a1060285211a1a3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-232.ec2.internal\" (UID: \"410d242d04c4d94f2a1060285211a1a3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-232.ec2.internal"
Apr 20 13:30:45.343270 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.343146 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0ea90522398b66e089408acd5ec34cb0-config\") pod \"kube-apiserver-proxy-ip-10-0-132-232.ec2.internal\" (UID: \"0ea90522398b66e089408acd5ec34cb0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-232.ec2.internal"
Apr 20 13:30:45.343270 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.343215 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0ea90522398b66e089408acd5ec34cb0-config\") pod \"kube-apiserver-proxy-ip-10-0-132-232.ec2.internal\" (UID: \"0ea90522398b66e089408acd5ec34cb0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-232.ec2.internal"
Apr 20 13:30:45.343270 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.343230 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/410d242d04c4d94f2a1060285211a1a3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-232.ec2.internal\" (UID: \"410d242d04c4d94f2a1060285211a1a3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-232.ec2.internal"
Apr 20 13:30:45.435523 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:45.435476 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-232.ec2.internal\" not found"
Apr 20 13:30:45.502083 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.502030 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-232.ec2.internal"
Apr 20 13:30:45.505512 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.505493 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-232.ec2.internal"
Apr 20 13:30:45.536030 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:45.535995 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-232.ec2.internal\" not found"
Apr 20 13:30:45.636675 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:45.636587 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-232.ec2.internal\" not found"
Apr 20 13:30:45.737175 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:45.737142 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-232.ec2.internal\" not found"
Apr 20 13:30:45.816375 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.816347 2563 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 13:30:45.831315 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.831291 2563 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 13:30:45.831459 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.831439 2563 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 13:30:45.831499 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.831462 2563 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 13:30:45.837451 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:45.837427 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-232.ec2.internal\" not found"
Apr 20 13:30:45.851111 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.851086 2563 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 13:30:45.914365 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.914290 2563 apiserver.go:52] "Watching apiserver"
Apr 20 13:30:45.926735 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.926682 2563 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 13:30:45.929358 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.929325 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-xwkrd","openshift-multus/network-metrics-daemon-5pkrd","openshift-network-diagnostics/network-check-target-x6gsn","kube-system/konnectivity-agent-l7wx7","openshift-dns/node-resolver-rxqc7","openshift-image-registry/node-ca-hw898","openshift-multus/multus-bg8cz","openshift-network-operator/iptables-alerter-wdb75","openshift-ovn-kubernetes/ovnkube-node-mp62t","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6","openshift-cluster-node-tuning-operator/tuned-bvb2g"]
Apr 20 13:30:45.932237 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.932217 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xwkrd"
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xwkrd" Apr 20 13:30:45.933177 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.933153 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:30:45.933291 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.933193 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:30:45.933291 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:45.933250 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6gsn" podUID="1d19bff6-e6ed-46e3-854b-04097f537694" Apr 20 13:30:45.933291 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:45.933250 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5pkrd" podUID="02041a2f-e9fd-4902-a9a4-47e4cd2889e4" Apr 20 13:30:45.934132 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.934115 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-l7wx7" Apr 20 13:30:45.934781 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.934622 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 13:30:45.934781 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.934623 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 13:30:45.934781 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.934673 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 13:30:45.934781 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.934688 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 13:30:45.934781 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.934722 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 13:30:45.935152 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.934956 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-gchtz\"" Apr 20 13:30:45.935237 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.935205 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-rxqc7" Apr 20 13:30:45.936085 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.936065 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 13:30:45.936188 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.936175 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gdjnq\"" Apr 20 13:30:45.936351 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.936338 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 13:30:45.936423 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.936406 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hw898" Apr 20 13:30:45.937336 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.937319 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 13:30:45.937411 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.937364 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 13:30:45.937411 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.937369 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-d66cx\"" Apr 20 13:30:45.937571 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.937547 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bg8cz" Apr 20 13:30:45.938293 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.938274 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 13:30:45.938563 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.938547 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 13:30:45.938651 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.938571 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-54ggv\"" Apr 20 13:30:45.938651 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.938629 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-wdb75" Apr 20 13:30:45.938829 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.938673 2563 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 13:30:45.938829 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.938748 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 13:30:45.939482 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.939453 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 13:30:45.939592 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.939536 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mdmhs\"" Apr 20 13:30:45.939674 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.939646 2563 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-232.ec2.internal" Apr 20 13:30:45.940000 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.939982 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:45.940863 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.940844 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 13:30:45.940950 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.940844 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-z5xx8\"" Apr 20 13:30:45.940950 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.940923 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 13:30:45.941063 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.940969 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 13:30:45.941605 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.941589 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6" Apr 20 13:30:45.941700 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.941682 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 13:30:45.942147 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.942129 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-phdfg\"" Apr 20 13:30:45.942147 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.942130 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 13:30:45.942429 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.942411 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 13:30:45.942673 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.942652 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 13:30:45.942769 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.942706 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 13:30:45.942769 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.942716 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 13:30:45.943820 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.943789 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-kmvbg\"" Apr 20 13:30:45.943911 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.943817 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 13:30:45.944197 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.944179 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 13:30:45.946319 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.944675 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 13:30:45.946319 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.945739 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqb4g\" (UniqueName: \"kubernetes.io/projected/f22ff795-52da-4095-9d35-f9d44f2b8239-kube-api-access-vqb4g\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd" Apr 20 13:30:45.946319 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.945785 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/69f16c24-4d9a-4565-82a7-dbe15561755e-ovnkube-script-lib\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:45.946319 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.945820 2563 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pptlt\" (UniqueName: \"kubernetes.io/projected/07b692b0-dcb9-4b70-b09f-4857c9ea8dc1-kube-api-access-pptlt\") pod \"aws-ebs-csi-driver-node-wrhs6\" (UID: \"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6" Apr 20 13:30:45.946319 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.945853 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/22136a88-f60c-4f20-8b96-f6af8da37f19-konnectivity-ca\") pod \"konnectivity-agent-l7wx7\" (UID: \"22136a88-f60c-4f20-8b96-f6af8da37f19\") " pod="kube-system/konnectivity-agent-l7wx7" Apr 20 13:30:45.946319 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.945883 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-host-var-lib-cni-multus\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:45.946319 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.945908 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-hostroot\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:45.946319 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.945941 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/07b692b0-dcb9-4b70-b09f-4857c9ea8dc1-socket-dir\") pod \"aws-ebs-csi-driver-node-wrhs6\" (UID: \"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6" Apr 20 13:30:45.946319 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.945972 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f22ff795-52da-4095-9d35-f9d44f2b8239-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd" Apr 20 13:30:45.946319 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946005 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/69f16c24-4d9a-4565-82a7-dbe15561755e-env-overrides\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:45.946319 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946036 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-multus-socket-dir-parent\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:45.946319 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946087 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-node-log\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:45.946319 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946148 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-host-run-multus-certs\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:45.946319 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946221 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p26f8\" (UniqueName: \"kubernetes.io/projected/99cf3e2c-0587-4d53-ae7a-4dfaea501010-kube-api-access-p26f8\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:45.946319 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946267 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-var-lib-openvswitch\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:45.946972 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946342 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-run-openvswitch\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:45.946972 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946439 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-host-run-k8s-cni-cncf-io\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:45.946972 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946502 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4684c960-a06b-424c-815f-60be0a4e478f-iptables-alerter-script\") pod \"iptables-alerter-wdb75\" (UID: \"4684c960-a06b-424c-815f-60be0a4e478f\") " pod="openshift-network-operator/iptables-alerter-wdb75" Apr 20 13:30:45.946972 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946539 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/69f16c24-4d9a-4565-82a7-dbe15561755e-ovn-node-metrics-cert\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:45.946972 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946566 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/22136a88-f60c-4f20-8b96-f6af8da37f19-agent-certs\") pod \"konnectivity-agent-l7wx7\" (UID: 
\"22136a88-f60c-4f20-8b96-f6af8da37f19\") " pod="kube-system/konnectivity-agent-l7wx7" Apr 20 13:30:45.946972 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946591 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99cf3e2c-0587-4d53-ae7a-4dfaea501010-cni-binary-copy\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:45.946972 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946611 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-host-slash\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:45.946972 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946638 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-run-systemd\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:45.946972 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946703 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:45.946972 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946712 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-log-socket\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:45.946972 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946744 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/69f16c24-4d9a-4565-82a7-dbe15561755e-ovnkube-config\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:45.946972 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946775 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f22ff795-52da-4095-9d35-f9d44f2b8239-cni-binary-copy\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd" Apr 20 13:30:45.946972 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946828 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b1a9fc24-9a0e-4d24-aa45-ec1711e1399a-host\") pod \"node-ca-hw898\" (UID: \"b1a9fc24-9a0e-4d24-aa45-ec1711e1399a\") " pod="openshift-image-registry/node-ca-hw898" Apr 20 13:30:45.946972 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946852 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-host-cni-bin\") pod \"ovnkube-node-mp62t\" (UID: 
\"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:45.946972 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946883 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07b692b0-dcb9-4b70-b09f-4857c9ea8dc1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wrhs6\" (UID: \"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6" Apr 20 13:30:45.946972 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946925 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/07b692b0-dcb9-4b70-b09f-4857c9ea8dc1-device-dir\") pod \"aws-ebs-csi-driver-node-wrhs6\" (UID: \"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6" Apr 20 13:30:45.947630 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.946987 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs\") pod \"network-metrics-daemon-5pkrd\" (UID: \"02041a2f-e9fd-4902-a9a4-47e4cd2889e4\") " pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:30:45.947630 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.947078 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-system-cni-dir\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:45.947630 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.947108 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/99cf3e2c-0587-4d53-ae7a-4dfaea501010-multus-daemon-config\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:45.947630 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.947132 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8ca98088-8b65-4efe-ad4e-3df5a8fe02b5-hosts-file\") pod \"node-resolver-rxqc7\" (UID: \"8ca98088-8b65-4efe-ad4e-3df5a8fe02b5\") " pod="openshift-dns/node-resolver-rxqc7" Apr 20 13:30:45.947630 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.947151 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f22ff795-52da-4095-9d35-f9d44f2b8239-system-cni-dir\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd" Apr 20 13:30:45.947630 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.947181 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f22ff795-52da-4095-9d35-f9d44f2b8239-cnibin\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd" Apr 20 13:30:45.947630 ip-10-0-132-232 
kubenswrapper[2563]: I0420 13:30:45.947227 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-run-ovn\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:45.947630 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.947258 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-cnibin\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:45.947630 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.947280 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-etc-kubernetes\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:45.947630 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.947298 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4684c960-a06b-424c-815f-60be0a4e478f-host-slash\") pod \"iptables-alerter-wdb75\" (UID: \"4684c960-a06b-424c-815f-60be0a4e478f\") " pod="openshift-network-operator/iptables-alerter-wdb75" Apr 20 13:30:45.947630 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.947318 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqplb\" (UniqueName: \"kubernetes.io/projected/4684c960-a06b-424c-815f-60be0a4e478f-kube-api-access-kqplb\") pod \"iptables-alerter-wdb75\" (UID: \"4684c960-a06b-424c-815f-60be0a4e478f\") " pod="openshift-network-operator/iptables-alerter-wdb75" Apr 20 13:30:45.947630 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.947337 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f22ff795-52da-4095-9d35-f9d44f2b8239-os-release\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd" Apr 20 13:30:45.947630 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.947395 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:45.947630 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.947426 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-etc-openvswitch\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:45.948288 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.947446 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-97nbv\" (UniqueName: \"kubernetes.io/projected/8ca98088-8b65-4efe-ad4e-3df5a8fe02b5-kube-api-access-97nbv\") pod \"node-resolver-rxqc7\" (UID: \"8ca98088-8b65-4efe-ad4e-3df5a8fe02b5\") " pod="openshift-dns/node-resolver-rxqc7" Apr 20 13:30:45.948288 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.947894 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqcvp\" (UniqueName: \"kubernetes.io/projected/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-kube-api-access-bqcvp\") pod \"network-metrics-daemon-5pkrd\" (UID: \"02041a2f-e9fd-4902-a9a4-47e4cd2889e4\") " pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:30:45.948288 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.947917 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2k6g\" (UniqueName: \"kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g\") pod \"network-check-target-x6gsn\" (UID: \"1d19bff6-e6ed-46e3-854b-04097f537694\") " pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:30:45.948288 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.947939 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-host-run-ovn-kubernetes\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:45.948288 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.947966 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/07b692b0-dcb9-4b70-b09f-4857c9ea8dc1-etc-selinux\") pod \"aws-ebs-csi-driver-node-wrhs6\" (UID: \"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6" Apr 20 13:30:45.948288 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.947998 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f22ff795-52da-4095-9d35-f9d44f2b8239-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd" Apr 20 13:30:45.948288 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.948029 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-host-kubelet\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:45.948288 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.948080 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-host-cni-netd\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:45.948288 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.948099 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-host-var-lib-cni-bin\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:45.948288 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.948118 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/07b692b0-dcb9-4b70-b09f-4857c9ea8dc1-sys-fs\") pod \"aws-ebs-csi-driver-node-wrhs6\" (UID: \"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6" Apr 20 13:30:45.948288 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.948184 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f22ff795-52da-4095-9d35-f9d44f2b8239-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd" Apr 20 13:30:45.948288 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.948227 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbd96\" (UniqueName: \"kubernetes.io/projected/b1a9fc24-9a0e-4d24-aa45-ec1711e1399a-kube-api-access-zbd96\") pod \"node-ca-hw898\" (UID: \"b1a9fc24-9a0e-4d24-aa45-ec1711e1399a\") " pod="openshift-image-registry/node-ca-hw898" Apr 20 13:30:45.948288 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.948265 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-systemd-units\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:45.949087 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.948317 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-host-var-lib-kubelet\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:45.949087 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.948346 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-multus-conf-dir\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:45.949087 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.948802 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b1a9fc24-9a0e-4d24-aa45-ec1711e1399a-serviceca\") pod \"node-ca-hw898\" (UID: \"b1a9fc24-9a0e-4d24-aa45-ec1711e1399a\") " pod="openshift-image-registry/node-ca-hw898" Apr 20 13:30:45.949087 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.948827 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-host-run-netns\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:45.949087 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.948868 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9c2b\" (UniqueName: \"kubernetes.io/projected/69f16c24-4d9a-4565-82a7-dbe15561755e-kube-api-access-t9c2b\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:45.949087 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.948893 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/07b692b0-dcb9-4b70-b09f-4857c9ea8dc1-registration-dir\") pod \"aws-ebs-csi-driver-node-wrhs6\" (UID: \"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6" Apr 20 13:30:45.949087 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.948917 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-multus-cni-dir\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:45.949087 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.948940 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-os-release\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:45.949087 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.948970 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-host-run-netns\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:45.949087 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.948976 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 13:30:45.949087 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.948994 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ca98088-8b65-4efe-ad4e-3df5a8fe02b5-tmp-dir\") pod \"node-resolver-rxqc7\" (UID: \"8ca98088-8b65-4efe-ad4e-3df5a8fe02b5\") " pod="openshift-dns/node-resolver-rxqc7" Apr 20 13:30:45.949087 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.949039 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 13:30:45.949087 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.949041 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-9q7gl\"" Apr 20 13:30:45.951637 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.951620 2563 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 13:30:45.951725 ip-10-0-132-232 kubenswrapper[2563]: I0420 
13:30:45.951688 2563 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-232.ec2.internal" Apr 20 13:30:45.952111 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.952094 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-232.ec2.internal"] Apr 20 13:30:45.955200 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.955182 2563 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 13:30:45.962230 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.962138 2563 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 13:30:45.962230 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.962159 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-132-232.ec2.internal"] Apr 20 13:30:45.963857 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:45.963823 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ea90522398b66e089408acd5ec34cb0.slice/crio-96669d0262bd0a3814568cfdbb3bf42d5edf1d0a832f993559b9d68b0b417903 WatchSource:0}: Error finding container 96669d0262bd0a3814568cfdbb3bf42d5edf1d0a832f993559b9d68b0b417903: Status 404 returned error can't find the container with id 96669d0262bd0a3814568cfdbb3bf42d5edf1d0a832f993559b9d68b0b417903 Apr 20 13:30:45.964430 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:45.964411 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod410d242d04c4d94f2a1060285211a1a3.slice/crio-ab561aa0e24acf27ce2591b1bcb8c6d4b91c26f44754bbbee46e8c9726a91f4a WatchSource:0}: Error finding container ab561aa0e24acf27ce2591b1bcb8c6d4b91c26f44754bbbee46e8c9726a91f4a: Status 404 returned error can't find the container with id ab561aa0e24acf27ce2591b1bcb8c6d4b91c26f44754bbbee46e8c9726a91f4a Apr 20 13:30:45.969508 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.969493 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 13:30:45.978335 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.978317 2563 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-g5w4p" Apr 20 13:30:45.985268 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:45.985248 2563 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-g5w4p" Apr 20 13:30:46.031205 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.031172 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 13:25:45 +0000 UTC" deadline="2027-12-23 08:39:00.432393105 +0000 UTC" Apr 20 13:30:46.031205 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.031199 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14683h8m14.401195846s" Apr 20 13:30:46.040965 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.040944 2563 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 13:30:46.050089 
ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050064 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2k6g\" (UniqueName: \"kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g\") pod \"network-check-target-x6gsn\" (UID: \"1d19bff6-e6ed-46e3-854b-04097f537694\") " pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:30:46.050216 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050115 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-host-run-ovn-kubernetes\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:46.050216 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050146 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/07b692b0-dcb9-4b70-b09f-4857c9ea8dc1-etc-selinux\") pod \"aws-ebs-csi-driver-node-wrhs6\" (UID: \"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6" Apr 20 13:30:46.050216 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050165 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f22ff795-52da-4095-9d35-f9d44f2b8239-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd" Apr 20 13:30:46.050216 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050188 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-host-kubelet\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:46.050216 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050204 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-host-cni-netd\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:46.050471 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050223 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-host-var-lib-cni-bin\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:46.050471 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050237 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-host-run-ovn-kubernetes\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:46.050471 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050252 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-etc-modprobe-d\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.050471 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050279 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-host-kubelet\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:46.050471 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050287 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-host-cni-netd\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:46.050471 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050303 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/07b692b0-dcb9-4b70-b09f-4857c9ea8dc1-sys-fs\") pod \"aws-ebs-csi-driver-node-wrhs6\" (UID: \"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6" Apr 20 13:30:46.050471 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050310 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-host-var-lib-cni-bin\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:46.050471 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050325 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/07b692b0-dcb9-4b70-b09f-4857c9ea8dc1-etc-selinux\") pod \"aws-ebs-csi-driver-node-wrhs6\" (UID: \"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6" Apr 20 13:30:46.050471 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050335 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f22ff795-52da-4095-9d35-f9d44f2b8239-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd" Apr 20 13:30:46.050471 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050370 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbd96\" (UniqueName: \"kubernetes.io/projected/b1a9fc24-9a0e-4d24-aa45-ec1711e1399a-kube-api-access-zbd96\") pod \"node-ca-hw898\" (UID: \"b1a9fc24-9a0e-4d24-aa45-ec1711e1399a\") " pod="openshift-image-registry/node-ca-hw898" Apr 20 13:30:46.050471 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050395 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-systemd-units\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:46.050471 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050410 2563 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/07b692b0-dcb9-4b70-b09f-4857c9ea8dc1-sys-fs\") pod \"aws-ebs-csi-driver-node-wrhs6\" (UID: \"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6" Apr 20 13:30:46.050471 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050420 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-host-var-lib-kubelet\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:46.050471 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050447 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-multus-conf-dir\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:46.050471 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050460 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-host-var-lib-kubelet\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:46.050471 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050449 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-systemd-units\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:46.050471 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050479 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f22ff795-52da-4095-9d35-f9d44f2b8239-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd" Apr 20 13:30:46.051224 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050476 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-host\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.051224 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050477 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-multus-conf-dir\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:46.051224 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050514 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b1a9fc24-9a0e-4d24-aa45-ec1711e1399a-serviceca\") pod \"node-ca-hw898\" (UID: \"b1a9fc24-9a0e-4d24-aa45-ec1711e1399a\") " pod="openshift-image-registry/node-ca-hw898" Apr 20 13:30:46.051224 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050541 2563 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-host-run-netns\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:46.051224 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050565 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9c2b\" (UniqueName: \"kubernetes.io/projected/69f16c24-4d9a-4565-82a7-dbe15561755e-kube-api-access-t9c2b\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:46.051224 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050590 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/07b692b0-dcb9-4b70-b09f-4857c9ea8dc1-registration-dir\") pod \"aws-ebs-csi-driver-node-wrhs6\" (UID: \"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6" Apr 20 13:30:46.051224 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050923 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-multus-cni-dir\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:46.051224 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050976 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-os-release\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:46.051224 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.051016 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-host-run-netns\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:46.051224 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.051070 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ca98088-8b65-4efe-ad4e-3df5a8fe02b5-tmp-dir\") pod \"node-resolver-rxqc7\" (UID: \"8ca98088-8b65-4efe-ad4e-3df5a8fe02b5\") " pod="openshift-dns/node-resolver-rxqc7" Apr 20 13:30:46.051224 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.051104 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqb4g\" (UniqueName: \"kubernetes.io/projected/f22ff795-52da-4095-9d35-f9d44f2b8239-kube-api-access-vqb4g\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd" Apr 20 13:30:46.051224 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.051160 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/69f16c24-4d9a-4565-82a7-dbe15561755e-ovnkube-script-lib\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 
13:30:46.051224 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.051195 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f22ff795-52da-4095-9d35-f9d44f2b8239-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd" Apr 20 13:30:46.051224 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.051205 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b1a9fc24-9a0e-4d24-aa45-ec1711e1399a-serviceca\") pod \"node-ca-hw898\" (UID: \"b1a9fc24-9a0e-4d24-aa45-ec1711e1399a\") " pod="openshift-image-registry/node-ca-hw898" Apr 20 13:30:46.051224 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.050594 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-host-run-netns\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:46.051886 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.051199 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pptlt\" (UniqueName: \"kubernetes.io/projected/07b692b0-dcb9-4b70-b09f-4857c9ea8dc1-kube-api-access-pptlt\") pod \"aws-ebs-csi-driver-node-wrhs6\" (UID: \"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6" Apr 20 13:30:46.051886 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.051293 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-os-release\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:46.051886 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.051339 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-host-run-netns\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" Apr 20 13:30:46.052126 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.051576 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/07b692b0-dcb9-4b70-b09f-4857c9ea8dc1-registration-dir\") pod \"aws-ebs-csi-driver-node-wrhs6\" (UID: \"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6" Apr 20 13:30:46.052208 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052188 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8ca98088-8b65-4efe-ad4e-3df5a8fe02b5-tmp-dir\") pod \"node-resolver-rxqc7\" (UID: \"8ca98088-8b65-4efe-ad4e-3df5a8fe02b5\") " pod="openshift-dns/node-resolver-rxqc7" Apr 20 13:30:46.052287 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052214 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-multus-cni-dir\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz" 
Apr 20 13:30:46.052287 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052246 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/22136a88-f60c-4f20-8b96-f6af8da37f19-konnectivity-ca\") pod \"konnectivity-agent-l7wx7\" (UID: \"22136a88-f60c-4f20-8b96-f6af8da37f19\") " pod="kube-system/konnectivity-agent-l7wx7"
Apr 20 13:30:46.052371 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052301 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-host-var-lib-cni-multus\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.052371 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052334 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-hostroot\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.052459 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052384 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-etc-kubernetes\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g"
Apr 20 13:30:46.052459 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052415 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/07b692b0-dcb9-4b70-b09f-4857c9ea8dc1-socket-dir\") pod \"aws-ebs-csi-driver-node-wrhs6\" (UID: \"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6"
Apr 20 13:30:46.052459 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052454 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-etc-sysconfig\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g"
Apr 20 13:30:46.052581 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052464 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/69f16c24-4d9a-4565-82a7-dbe15561755e-ovnkube-script-lib\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.052581 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052487 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-var-lib-kubelet\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g"
Apr 20 13:30:46.052581 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052544 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f4970e67-64f3-458f-91c6-003d6ca835f9-etc-tuned\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g"
Apr 20 13:30:46.052581 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052547 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-host-var-lib-cni-multus\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.052782 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052607 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-hostroot\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.052782 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052648 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f22ff795-52da-4095-9d35-f9d44f2b8239-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd"
Apr 20 13:30:46.052782 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052721 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/69f16c24-4d9a-4565-82a7-dbe15561755e-env-overrides\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.052782 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052725 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/07b692b0-dcb9-4b70-b09f-4857c9ea8dc1-socket-dir\") pod \"aws-ebs-csi-driver-node-wrhs6\" (UID: \"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6"
Apr 20 13:30:46.052782 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052731 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/22136a88-f60c-4f20-8b96-f6af8da37f19-konnectivity-ca\") pod \"konnectivity-agent-l7wx7\" (UID: \"22136a88-f60c-4f20-8b96-f6af8da37f19\") " pod="kube-system/konnectivity-agent-l7wx7"
Apr 20 13:30:46.052782 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052748 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-multus-socket-dir-parent\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.053025 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052817 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-multus-socket-dir-parent\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.053025 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052877 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-etc-sysctl-conf\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g"
Apr 20 13:30:46.053025 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052916 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-node-log\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.053025 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.052950 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-host-run-multus-certs\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.053025 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053002 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p26f8\" (UniqueName: \"kubernetes.io/projected/99cf3e2c-0587-4d53-ae7a-4dfaea501010-kube-api-access-p26f8\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.053263 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053031 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-var-lib-openvswitch\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.053263 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053105 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-etc-systemd\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g"
Apr 20 13:30:46.053263 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053135 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/69f16c24-4d9a-4565-82a7-dbe15561755e-env-overrides\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.053263 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053137 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-run-openvswitch\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.053263 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053205 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-host-run-k8s-cni-cncf-io\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.053263 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053194 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-run-openvswitch\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.053263 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053242 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4684c960-a06b-424c-815f-60be0a4e478f-iptables-alerter-script\") pod \"iptables-alerter-wdb75\" (UID: \"4684c960-a06b-424c-815f-60be0a4e478f\") " pod="openshift-network-operator/iptables-alerter-wdb75"
Apr 20 13:30:46.053263 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053257 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-host-run-multus-certs\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.053607 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053280 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-etc-sysctl-d\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g"
Apr 20 13:30:46.053607 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053352 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-node-log\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.053607 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053393 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f22ff795-52da-4095-9d35-f9d44f2b8239-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd"
Apr 20 13:30:46.053607 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053449 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/69f16c24-4d9a-4565-82a7-dbe15561755e-ovn-node-metrics-cert\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.053607 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053502 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/22136a88-f60c-4f20-8b96-f6af8da37f19-agent-certs\") pod \"konnectivity-agent-l7wx7\" (UID: \"22136a88-f60c-4f20-8b96-f6af8da37f19\") " pod="kube-system/konnectivity-agent-l7wx7"
Apr 20 13:30:46.053607 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053553 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99cf3e2c-0587-4d53-ae7a-4dfaea501010-cni-binary-copy\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.053607 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053588 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-host-slash\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.053896 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053679 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-host-run-k8s-cni-cncf-io\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.053896 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053722 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-run-systemd\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.053896 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053756 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-log-socket\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.053896 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053782 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/69f16c24-4d9a-4565-82a7-dbe15561755e-ovnkube-config\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.053896 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053816 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9b89\" (UniqueName: \"kubernetes.io/projected/f4970e67-64f3-458f-91c6-003d6ca835f9-kube-api-access-m9b89\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g"
Apr 20 13:30:46.053896 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053849 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f22ff795-52da-4095-9d35-f9d44f2b8239-cni-binary-copy\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd"
Apr 20 13:30:46.053896 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053882 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b1a9fc24-9a0e-4d24-aa45-ec1711e1399a-host\") pod \"node-ca-hw898\" (UID: \"b1a9fc24-9a0e-4d24-aa45-ec1711e1399a\") " pod="openshift-image-registry/node-ca-hw898"
Apr 20 13:30:46.054201 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053957 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-host-cni-bin\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.054201 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.053987 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07b692b0-dcb9-4b70-b09f-4857c9ea8dc1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wrhs6\" (UID: \"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6"
Apr 20 13:30:46.054201 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.054020 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/07b692b0-dcb9-4b70-b09f-4857c9ea8dc1-device-dir\") pod \"aws-ebs-csi-driver-node-wrhs6\" (UID: \"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6"
Apr 20 13:30:46.054201 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.054084 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs\") pod \"network-metrics-daemon-5pkrd\" (UID: \"02041a2f-e9fd-4902-a9a4-47e4cd2889e4\") " pod="openshift-multus/network-metrics-daemon-5pkrd"
Apr 20 13:30:46.054201 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.054115 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-system-cni-dir\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.054201 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.054142 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/99cf3e2c-0587-4d53-ae7a-4dfaea501010-multus-daemon-config\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.054449 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.054208 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8ca98088-8b65-4efe-ad4e-3df5a8fe02b5-hosts-file\") pod \"node-resolver-rxqc7\" (UID: \"8ca98088-8b65-4efe-ad4e-3df5a8fe02b5\") " pod="openshift-dns/node-resolver-rxqc7"
Apr 20 13:30:46.054449 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.054237 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f22ff795-52da-4095-9d35-f9d44f2b8239-system-cni-dir\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd"
Apr 20 13:30:46.054449 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.054268 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f22ff795-52da-4095-9d35-f9d44f2b8239-cnibin\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd"
Apr 20 13:30:46.054449 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.054299 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-run-ovn\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.054449 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.054328 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-cnibin\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.054449 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.054356 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-etc-kubernetes\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.054721 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.054497 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4684c960-a06b-424c-815f-60be0a4e478f-host-slash\") pod \"iptables-alerter-wdb75\" (UID: \"4684c960-a06b-424c-815f-60be0a4e478f\") " pod="openshift-network-operator/iptables-alerter-wdb75"
Apr 20 13:30:46.054721 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.054624 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8ca98088-8b65-4efe-ad4e-3df5a8fe02b5-hosts-file\") pod \"node-resolver-rxqc7\" (UID: \"8ca98088-8b65-4efe-ad4e-3df5a8fe02b5\") " pod="openshift-dns/node-resolver-rxqc7"
Apr 20 13:30:46.054721 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:46.054696 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 13:30:46.054869 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.054804 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07b692b0-dcb9-4b70-b09f-4857c9ea8dc1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wrhs6\" (UID: \"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6"
Apr 20 13:30:46.054869 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.054843 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-cnibin\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.054970 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.054892 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/4684c960-a06b-424c-815f-60be0a4e478f-iptables-alerter-script\") pod \"iptables-alerter-wdb75\" (UID: \"4684c960-a06b-424c-815f-60be0a4e478f\") " pod="openshift-network-operator/iptables-alerter-wdb75"
Apr 20 13:30:46.054970 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.054949 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-system-cni-dir\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.055088 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.054963 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-host-slash\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.055088 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:46.054990 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs podName:02041a2f-e9fd-4902-a9a4-47e4cd2889e4 nodeName:}" failed. No retries permitted until 2026-04-20 13:30:46.554868685 +0000 UTC m=+2.118559954 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs") pod "network-metrics-daemon-5pkrd" (UID: "02041a2f-e9fd-4902-a9a4-47e4cd2889e4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 13:30:46.055088 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055006 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqplb\" (UniqueName: \"kubernetes.io/projected/4684c960-a06b-424c-815f-60be0a4e478f-kube-api-access-kqplb\") pod \"iptables-alerter-wdb75\" (UID: \"4684c960-a06b-424c-815f-60be0a4e478f\") " pod="openshift-network-operator/iptables-alerter-wdb75"
Apr 20 13:30:46.055088 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055009 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99cf3e2c-0587-4d53-ae7a-4dfaea501010-etc-kubernetes\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.055088 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055009 2563 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 20 13:30:46.055334 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055098 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f22ff795-52da-4095-9d35-f9d44f2b8239-os-release\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd"
Apr 20 13:30:46.055334 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055125 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.055334 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055127 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f22ff795-52da-4095-9d35-f9d44f2b8239-cni-binary-copy\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd"
Apr 20 13:30:46.055334 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055125 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f22ff795-52da-4095-9d35-f9d44f2b8239-system-cni-dir\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd"
Apr 20 13:30:46.055334 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055153 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-run\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g"
Apr 20 13:30:46.055334 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055154 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b1a9fc24-9a0e-4d24-aa45-ec1711e1399a-host\") pod \"node-ca-hw898\" (UID: \"b1a9fc24-9a0e-4d24-aa45-ec1711e1399a\") " pod="openshift-image-registry/node-ca-hw898"
Apr 20 13:30:46.055334 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055175 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-sys\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g"
Apr 20 13:30:46.055334 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055198 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-etc-openvswitch\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.055334 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055258 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-host-cni-bin\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.055334 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055280 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/07b692b0-dcb9-4b70-b09f-4857c9ea8dc1-device-dir\") pod \"aws-ebs-csi-driver-node-wrhs6\" (UID: \"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6"
Apr 20 13:30:46.055334 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055285 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-log-socket\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.055334 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055294 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f22ff795-52da-4095-9d35-f9d44f2b8239-os-release\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd"
Apr 20 13:30:46.055334 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055310 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4684c960-a06b-424c-815f-60be0a4e478f-host-slash\") pod \"iptables-alerter-wdb75\" (UID: \"4684c960-a06b-424c-815f-60be0a4e478f\") " pod="openshift-network-operator/iptables-alerter-wdb75"
Apr 20 13:30:46.055955 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055343 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-run-systemd\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.055955 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055367 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-etc-openvswitch\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.055955 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055407 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.055955 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055411 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-var-lib-openvswitch\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.055955 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055409 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f22ff795-52da-4095-9d35-f9d44f2b8239-cnibin\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd"
Apr 20 13:30:46.055955 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055529 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/69f16c24-4d9a-4565-82a7-dbe15561755e-run-ovn\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.055955 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055567 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-lib-modules\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g"
Apr 20 13:30:46.055955 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055605 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f4970e67-64f3-458f-91c6-003d6ca835f9-tmp\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g"
Apr 20 13:30:46.055955 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055609 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/99cf3e2c-0587-4d53-ae7a-4dfaea501010-multus-daemon-config\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.055955 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055633 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99cf3e2c-0587-4d53-ae7a-4dfaea501010-cni-binary-copy\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.055955 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055640 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97nbv\" (UniqueName: \"kubernetes.io/projected/8ca98088-8b65-4efe-ad4e-3df5a8fe02b5-kube-api-access-97nbv\") pod \"node-resolver-rxqc7\" (UID: \"8ca98088-8b65-4efe-ad4e-3df5a8fe02b5\") " pod="openshift-dns/node-resolver-rxqc7"
Apr 20 13:30:46.055955 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.055777 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqcvp\" (UniqueName: \"kubernetes.io/projected/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-kube-api-access-bqcvp\") pod \"network-metrics-daemon-5pkrd\" (UID: \"02041a2f-e9fd-4902-a9a4-47e4cd2889e4\") " pod="openshift-multus/network-metrics-daemon-5pkrd"
Apr 20 13:30:46.056482 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.056461 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/69f16c24-4d9a-4565-82a7-dbe15561755e-ovnkube-config\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.058447 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.058422 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/69f16c24-4d9a-4565-82a7-dbe15561755e-ovn-node-metrics-cert\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.058586 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.058567 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/22136a88-f60c-4f20-8b96-f6af8da37f19-agent-certs\") pod \"konnectivity-agent-l7wx7\" (UID: \"22136a88-f60c-4f20-8b96-f6af8da37f19\") " pod="kube-system/konnectivity-agent-l7wx7"
Apr 20 13:30:46.062233 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:46.062215 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 13:30:46.062354 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:46.062243 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 13:30:46.062354 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:46.062256 2563 projected.go:194] Error preparing data for projected volume kube-api-access-m2k6g for pod openshift-network-diagnostics/network-check-target-x6gsn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 13:30:46.062354 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:46.062323 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g podName:1d19bff6-e6ed-46e3-854b-04097f537694 nodeName:}" failed. No retries permitted until 2026-04-20 13:30:46.562307124 +0000 UTC m=+2.125998390 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-m2k6g" (UniqueName: "kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g") pod "network-check-target-x6gsn" (UID: "1d19bff6-e6ed-46e3-854b-04097f537694") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 13:30:46.064971 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.064929 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pptlt\" (UniqueName: \"kubernetes.io/projected/07b692b0-dcb9-4b70-b09f-4857c9ea8dc1-kube-api-access-pptlt\") pod \"aws-ebs-csi-driver-node-wrhs6\" (UID: \"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6"
Apr 20 13:30:46.065194 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.065144 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-232.ec2.internal" event={"ID":"410d242d04c4d94f2a1060285211a1a3","Type":"ContainerStarted","Data":"ab561aa0e24acf27ce2591b1bcb8c6d4b91c26f44754bbbee46e8c9726a91f4a"}
Apr 20 13:30:46.065431 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.065405 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p26f8\" (UniqueName: \"kubernetes.io/projected/99cf3e2c-0587-4d53-ae7a-4dfaea501010-kube-api-access-p26f8\") pod \"multus-bg8cz\" (UID: \"99cf3e2c-0587-4d53-ae7a-4dfaea501010\") " pod="openshift-multus/multus-bg8cz"
Apr 20 13:30:46.065557 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.065531 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqb4g\" (UniqueName: \"kubernetes.io/projected/f22ff795-52da-4095-9d35-f9d44f2b8239-kube-api-access-vqb4g\") pod \"multus-additional-cni-plugins-xwkrd\" (UID: \"f22ff795-52da-4095-9d35-f9d44f2b8239\") " pod="openshift-multus/multus-additional-cni-plugins-xwkrd"
Apr 20 13:30:46.065618 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.065555 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqplb\" (UniqueName: \"kubernetes.io/projected/4684c960-a06b-424c-815f-60be0a4e478f-kube-api-access-kqplb\") pod \"iptables-alerter-wdb75\" (UID: \"4684c960-a06b-424c-815f-60be0a4e478f\") " pod="openshift-network-operator/iptables-alerter-wdb75"
Apr 20 13:30:46.065672 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.065633 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9c2b\" (UniqueName: \"kubernetes.io/projected/69f16c24-4d9a-4565-82a7-dbe15561755e-kube-api-access-t9c2b\") pod \"ovnkube-node-mp62t\" (UID: \"69f16c24-4d9a-4565-82a7-dbe15561755e\") " pod="openshift-ovn-kubernetes/ovnkube-node-mp62t"
Apr 20 13:30:46.066162 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.066141 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbd96\" (UniqueName: \"kubernetes.io/projected/b1a9fc24-9a0e-4d24-aa45-ec1711e1399a-kube-api-access-zbd96\") pod \"node-ca-hw898\" (UID: \"b1a9fc24-9a0e-4d24-aa45-ec1711e1399a\") " pod="openshift-image-registry/node-ca-hw898"
Apr 20 13:30:46.066739 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.066717 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-232.ec2.internal"
event={"ID":"0ea90522398b66e089408acd5ec34cb0","Type":"ContainerStarted","Data":"96669d0262bd0a3814568cfdbb3bf42d5edf1d0a832f993559b9d68b0b417903"} Apr 20 13:30:46.067013 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.066998 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqcvp\" (UniqueName: \"kubernetes.io/projected/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-kube-api-access-bqcvp\") pod \"network-metrics-daemon-5pkrd\" (UID: \"02041a2f-e9fd-4902-a9a4-47e4cd2889e4\") " pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:30:46.067469 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.067454 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97nbv\" (UniqueName: \"kubernetes.io/projected/8ca98088-8b65-4efe-ad4e-3df5a8fe02b5-kube-api-access-97nbv\") pod \"node-resolver-rxqc7\" (UID: \"8ca98088-8b65-4efe-ad4e-3df5a8fe02b5\") " pod="openshift-dns/node-resolver-rxqc7" Apr 20 13:30:46.156922 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.156889 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-etc-systemd\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.156922 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.156921 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-etc-sysctl-d\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157155 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.156940 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9b89\" (UniqueName: \"kubernetes.io/projected/f4970e67-64f3-458f-91c6-003d6ca835f9-kube-api-access-m9b89\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157155 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.156982 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-run\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157155 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.157001 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-sys\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157155 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.157010 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-etc-systemd\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157155 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.157021 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-lib-modules\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157155 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.157084 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f4970e67-64f3-458f-91c6-003d6ca835f9-tmp\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157155 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.157119 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-run\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157155 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.157130 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-lib-modules\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157155 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.157122 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-etc-sysctl-d\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157155 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.157132 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-etc-modprobe-d\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157155 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.157119 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-sys\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157542 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.157213 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-etc-modprobe-d\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157542 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.157230 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-host\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157542 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.157264 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-etc-kubernetes\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157542 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.157303 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-etc-kubernetes\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157542 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.157314 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-host\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157542 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.157324 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-etc-sysconfig\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157542 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.157348 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-var-lib-kubelet\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157542 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.157371 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f4970e67-64f3-458f-91c6-003d6ca835f9-etc-tuned\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157542 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.157376 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-etc-sysconfig\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157542 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.157398 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-etc-sysctl-conf\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157542 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.157447 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-var-lib-kubelet\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.157542 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.157527 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" 
(UniqueName: \"kubernetes.io/host-path/f4970e67-64f3-458f-91c6-003d6ca835f9-etc-sysctl-conf\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.159340 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.159323 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f4970e67-64f3-458f-91c6-003d6ca835f9-tmp\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.159442 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.159423 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f4970e67-64f3-458f-91c6-003d6ca835f9-etc-tuned\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.167623 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.167564 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9b89\" (UniqueName: \"kubernetes.io/projected/f4970e67-64f3-458f-91c6-003d6ca835f9-kube-api-access-m9b89\") pod \"tuned-bvb2g\" (UID: \"f4970e67-64f3-458f-91c6-003d6ca835f9\") " pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.270939 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.270902 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xwkrd" Apr 20 13:30:46.276756 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.276732 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-l7wx7" Apr 20 13:30:46.279033 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:46.279004 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf22ff795_52da_4095_9d35_f9d44f2b8239.slice/crio-566c9f835db8dd92a8afbf5bb37216e5f537f02bd4f4a3a05eb58d57aac10989 WatchSource:0}: Error finding container 566c9f835db8dd92a8afbf5bb37216e5f537f02bd4f4a3a05eb58d57aac10989: Status 404 returned error can't find the container with id 566c9f835db8dd92a8afbf5bb37216e5f537f02bd4f4a3a05eb58d57aac10989 Apr 20 13:30:46.283670 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.283648 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rxqc7" Apr 20 13:30:46.284768 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:46.284742 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22136a88_f60c_4f20_8b96_f6af8da37f19.slice/crio-aa7435fafd75a75054c4bde254979f8900cee80c7b67f24a557e8793172f90e9 WatchSource:0}: Error finding container aa7435fafd75a75054c4bde254979f8900cee80c7b67f24a557e8793172f90e9: Status 404 returned error can't find the container with id aa7435fafd75a75054c4bde254979f8900cee80c7b67f24a557e8793172f90e9 Apr 20 13:30:46.288915 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.288830 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-hw898" Apr 20 13:30:46.292142 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:46.292103 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ca98088_8b65_4efe_ad4e_3df5a8fe02b5.slice/crio-7557a8eb6778c4eeee67d405a11c847bbfd994fa5bb128d138870007ed3740a0 WatchSource:0}: Error finding container 7557a8eb6778c4eeee67d405a11c847bbfd994fa5bb128d138870007ed3740a0: Status 404 returned error can't find the container with id 7557a8eb6778c4eeee67d405a11c847bbfd994fa5bb128d138870007ed3740a0 Apr 20 13:30:46.293449 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.293375 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bg8cz" Apr 20 13:30:46.298826 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:46.298790 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1a9fc24_9a0e_4d24_aa45_ec1711e1399a.slice/crio-115f2dcc94e0e31879c9db8c491557d1150768c94611f5d90dfb3e7cb6118f44 WatchSource:0}: Error finding container 115f2dcc94e0e31879c9db8c491557d1150768c94611f5d90dfb3e7cb6118f44: Status 404 returned error can't find the container with id 115f2dcc94e0e31879c9db8c491557d1150768c94611f5d90dfb3e7cb6118f44 Apr 20 13:30:46.299952 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.299534 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-wdb75" Apr 20 13:30:46.302706 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:46.302687 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99cf3e2c_0587_4d53_ae7a_4dfaea501010.slice/crio-c8ac5640acf68fc92fcea6b4b749892ad441a7f4a1a2c45cdc3a7bba09ad6fa5 WatchSource:0}: Error finding container c8ac5640acf68fc92fcea6b4b749892ad441a7f4a1a2c45cdc3a7bba09ad6fa5: Status 404 returned error can't find the container with id c8ac5640acf68fc92fcea6b4b749892ad441a7f4a1a2c45cdc3a7bba09ad6fa5 Apr 20 13:30:46.304499 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.304439 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:30:46.309326 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:46.309301 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4684c960_a06b_424c_815f_60be0a4e478f.slice/crio-1c05d13a25190ec8f1d2a16aafb9a6abfecd7705f6e43fd9b0bf7801179e2411 WatchSource:0}: Error finding container 1c05d13a25190ec8f1d2a16aafb9a6abfecd7705f6e43fd9b0bf7801179e2411: Status 404 returned error can't find the container with id 1c05d13a25190ec8f1d2a16aafb9a6abfecd7705f6e43fd9b0bf7801179e2411 Apr 20 13:30:46.310208 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.310186 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6" Apr 20 13:30:46.311995 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:46.311969 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69f16c24_4d9a_4565_82a7_dbe15561755e.slice/crio-67785dddec8aeaa1b8aae6e3f2d94e8e23bc86599e41fcceaebecf4949284f9e WatchSource:0}: Error finding container 67785dddec8aeaa1b8aae6e3f2d94e8e23bc86599e41fcceaebecf4949284f9e: Status 404 returned error can't find the container with id 67785dddec8aeaa1b8aae6e3f2d94e8e23bc86599e41fcceaebecf4949284f9e Apr 20 13:30:46.315353 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.315334 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" Apr 20 13:30:46.321243 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:46.321215 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07b692b0_dcb9_4b70_b09f_4857c9ea8dc1.slice/crio-b38dcdf4ad5e66a20de6a715c8a0882b6a069e22c4d4b2822ac5cf608677533c WatchSource:0}: Error finding container b38dcdf4ad5e66a20de6a715c8a0882b6a069e22c4d4b2822ac5cf608677533c: Status 404 returned error can't find the container with id b38dcdf4ad5e66a20de6a715c8a0882b6a069e22c4d4b2822ac5cf608677533c Apr 20 13:30:46.323792 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:30:46.323768 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4970e67_64f3_458f_91c6_003d6ca835f9.slice/crio-77866bc5b35ed1677abd6e21153e3889c8ebcafba6cd92af0259778a4e99b6ed WatchSource:0}: Error finding container 77866bc5b35ed1677abd6e21153e3889c8ebcafba6cd92af0259778a4e99b6ed: Status 404 returned error can't find the container with id 77866bc5b35ed1677abd6e21153e3889c8ebcafba6cd92af0259778a4e99b6ed Apr 20 13:30:46.488586 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.488204 2563 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 13:30:46.560235 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.560193 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs\") pod \"network-metrics-daemon-5pkrd\" (UID: \"02041a2f-e9fd-4902-a9a4-47e4cd2889e4\") " pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:30:46.560439 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:46.560413 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:30:46.560502 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:46.560478 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs podName:02041a2f-e9fd-4902-a9a4-47e4cd2889e4 nodeName:}" failed. No retries permitted until 2026-04-20 13:30:47.560459792 +0000 UTC m=+3.124151043 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs") pod "network-metrics-daemon-5pkrd" (UID: "02041a2f-e9fd-4902-a9a4-47e4cd2889e4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:30:46.661626 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.661585 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2k6g\" (UniqueName: \"kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g\") pod \"network-check-target-x6gsn\" (UID: \"1d19bff6-e6ed-46e3-854b-04097f537694\") " pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:30:46.661802 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:46.661787 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 13:30:46.661881 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:46.661811 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 13:30:46.661881 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:46.661825 2563 projected.go:194] Error preparing data for projected volume kube-api-access-m2k6g for pod openshift-network-diagnostics/network-check-target-x6gsn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:30:46.661994 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:46.661887 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g podName:1d19bff6-e6ed-46e3-854b-04097f537694 nodeName:}" failed. No retries permitted until 2026-04-20 13:30:47.661867419 +0000 UTC m=+3.225558679 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-m2k6g" (UniqueName: "kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g") pod "network-check-target-x6gsn" (UID: "1d19bff6-e6ed-46e3-854b-04097f537694") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:30:46.986681 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.986635 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 13:25:45 +0000 UTC" deadline="2027-10-11 02:24:10.561296228 +0000 UTC" Apr 20 13:30:46.986681 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:46.986679 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12924h53m23.57462217s" Apr 20 13:30:47.006590 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:47.006537 2563 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 13:30:47.079453 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:47.079385 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" event={"ID":"f4970e67-64f3-458f-91c6-003d6ca835f9","Type":"ContainerStarted","Data":"77866bc5b35ed1677abd6e21153e3889c8ebcafba6cd92af0259778a4e99b6ed"} Apr 20 13:30:47.090584 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:47.090255 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" event={"ID":"69f16c24-4d9a-4565-82a7-dbe15561755e","Type":"ContainerStarted","Data":"67785dddec8aeaa1b8aae6e3f2d94e8e23bc86599e41fcceaebecf4949284f9e"} Apr 20 13:30:47.097505 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:47.097441 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rxqc7" event={"ID":"8ca98088-8b65-4efe-ad4e-3df5a8fe02b5","Type":"ContainerStarted","Data":"7557a8eb6778c4eeee67d405a11c847bbfd994fa5bb128d138870007ed3740a0"} Apr 20 13:30:47.100878 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:47.100810 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-l7wx7" event={"ID":"22136a88-f60c-4f20-8b96-f6af8da37f19","Type":"ContainerStarted","Data":"aa7435fafd75a75054c4bde254979f8900cee80c7b67f24a557e8793172f90e9"} Apr 20 13:30:47.103504 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:47.103429 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6" event={"ID":"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1","Type":"ContainerStarted","Data":"b38dcdf4ad5e66a20de6a715c8a0882b6a069e22c4d4b2822ac5cf608677533c"} Apr 20 13:30:47.108289 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:47.108209 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wdb75" event={"ID":"4684c960-a06b-424c-815f-60be0a4e478f","Type":"ContainerStarted","Data":"1c05d13a25190ec8f1d2a16aafb9a6abfecd7705f6e43fd9b0bf7801179e2411"} Apr 20 13:30:47.117022 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:47.116981 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bg8cz" event={"ID":"99cf3e2c-0587-4d53-ae7a-4dfaea501010","Type":"ContainerStarted","Data":"c8ac5640acf68fc92fcea6b4b749892ad441a7f4a1a2c45cdc3a7bba09ad6fa5"} Apr 20 13:30:47.132405 ip-10-0-132-232 kubenswrapper[2563]: 
I0420 13:30:47.132329 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hw898" event={"ID":"b1a9fc24-9a0e-4d24-aa45-ec1711e1399a","Type":"ContainerStarted","Data":"115f2dcc94e0e31879c9db8c491557d1150768c94611f5d90dfb3e7cb6118f44"} Apr 20 13:30:47.141545 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:47.141511 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xwkrd" event={"ID":"f22ff795-52da-4095-9d35-f9d44f2b8239","Type":"ContainerStarted","Data":"566c9f835db8dd92a8afbf5bb37216e5f537f02bd4f4a3a05eb58d57aac10989"} Apr 20 13:30:47.574596 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:47.574551 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs\") pod \"network-metrics-daemon-5pkrd\" (UID: \"02041a2f-e9fd-4902-a9a4-47e4cd2889e4\") " pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:30:47.574787 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:47.574764 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:30:47.574856 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:47.574839 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs podName:02041a2f-e9fd-4902-a9a4-47e4cd2889e4 nodeName:}" failed. No retries permitted until 2026-04-20 13:30:49.574812903 +0000 UTC m=+5.138504157 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs") pod "network-metrics-daemon-5pkrd" (UID: "02041a2f-e9fd-4902-a9a4-47e4cd2889e4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:30:47.676000 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:47.675956 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2k6g\" (UniqueName: \"kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g\") pod \"network-check-target-x6gsn\" (UID: \"1d19bff6-e6ed-46e3-854b-04097f537694\") " pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:30:47.676248 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:47.676229 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 13:30:47.676312 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:47.676256 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 13:30:47.676312 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:47.676302 2563 projected.go:194] Error preparing data for projected volume kube-api-access-m2k6g for pod openshift-network-diagnostics/network-check-target-x6gsn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:30:47.676413 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:47.676364 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g 
podName:1d19bff6-e6ed-46e3-854b-04097f537694 nodeName:}" failed. No retries permitted until 2026-04-20 13:30:49.676344896 +0000 UTC m=+5.240036150 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-m2k6g" (UniqueName: "kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g") pod "network-check-target-x6gsn" (UID: "1d19bff6-e6ed-46e3-854b-04097f537694") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:30:47.817999 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:47.817964 2563 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 13:30:47.987386 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:47.987227 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 13:25:45 +0000 UTC" deadline="2028-01-26 12:22:53.747053396 +0000 UTC" Apr 20 13:30:47.987386 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:47.987267 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15502h52m5.759789771s" Apr 20 13:30:48.062097 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:48.061477 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:30:48.062097 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:48.061623 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5pkrd" podUID="02041a2f-e9fd-4902-a9a4-47e4cd2889e4" Apr 20 13:30:48.062522 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:48.062366 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:30:48.062522 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:48.062483 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x6gsn" podUID="1d19bff6-e6ed-46e3-854b-04097f537694" Apr 20 13:30:49.596121 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:49.595707 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs\") pod \"network-metrics-daemon-5pkrd\" (UID: \"02041a2f-e9fd-4902-a9a4-47e4cd2889e4\") " pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:30:49.596121 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:49.595887 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:30:49.596121 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:49.595951 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs podName:02041a2f-e9fd-4902-a9a4-47e4cd2889e4 nodeName:}" failed. No retries permitted until 2026-04-20 13:30:53.595930577 +0000 UTC m=+9.159621828 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs") pod "network-metrics-daemon-5pkrd" (UID: "02041a2f-e9fd-4902-a9a4-47e4cd2889e4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:30:49.697753 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:49.697085 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2k6g\" (UniqueName: \"kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g\") pod \"network-check-target-x6gsn\" (UID: \"1d19bff6-e6ed-46e3-854b-04097f537694\") " pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:30:49.697753 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:49.697292 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 13:30:49.697753 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:49.697313 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 13:30:49.697753 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:49.697326 2563 projected.go:194] Error preparing data for projected volume kube-api-access-m2k6g for pod openshift-network-diagnostics/network-check-target-x6gsn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:30:49.697753 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:49.697388 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g podName:1d19bff6-e6ed-46e3-854b-04097f537694 nodeName:}" failed. No retries permitted until 2026-04-20 13:30:53.697368305 +0000 UTC m=+9.261059555 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-m2k6g" (UniqueName: "kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g") pod "network-check-target-x6gsn" (UID: "1d19bff6-e6ed-46e3-854b-04097f537694") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:30:50.062111 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:50.062077 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:30:50.062300 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:50.062207 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6gsn" podUID="1d19bff6-e6ed-46e3-854b-04097f537694" Apr 20 13:30:50.062632 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:50.062601 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:30:50.062735 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:50.062703 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5pkrd" podUID="02041a2f-e9fd-4902-a9a4-47e4cd2889e4" Apr 20 13:30:52.061486 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:52.061453 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:30:52.061918 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:52.061594 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5pkrd" podUID="02041a2f-e9fd-4902-a9a4-47e4cd2889e4" Apr 20 13:30:52.061918 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:52.061453 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:30:52.062005 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:52.061957 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x6gsn" podUID="1d19bff6-e6ed-46e3-854b-04097f537694" Apr 20 13:30:53.629652 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:53.629610 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs\") pod \"network-metrics-daemon-5pkrd\" (UID: \"02041a2f-e9fd-4902-a9a4-47e4cd2889e4\") " pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:30:53.630123 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:53.629798 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:30:53.630123 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:53.629969 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs podName:02041a2f-e9fd-4902-a9a4-47e4cd2889e4 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:01.629945264 +0000 UTC m=+17.193636529 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs") pod "network-metrics-daemon-5pkrd" (UID: "02041a2f-e9fd-4902-a9a4-47e4cd2889e4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:30:53.730992 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:53.730380 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2k6g\" (UniqueName: \"kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g\") pod \"network-check-target-x6gsn\" (UID: \"1d19bff6-e6ed-46e3-854b-04097f537694\") " pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:30:53.730992 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:53.730514 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 13:30:53.730992 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:53.730538 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 13:30:53.730992 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:53.730552 2563 projected.go:194] Error preparing data for projected volume kube-api-access-m2k6g for pod openshift-network-diagnostics/network-check-target-x6gsn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:30:53.730992 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:53.730620 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g podName:1d19bff6-e6ed-46e3-854b-04097f537694 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:01.730600252 +0000 UTC m=+17.294291501 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-m2k6g" (UniqueName: "kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g") pod "network-check-target-x6gsn" (UID: "1d19bff6-e6ed-46e3-854b-04097f537694") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:30:54.062064 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:54.062016 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:30:54.062263 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:54.062133 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6gsn" podUID="1d19bff6-e6ed-46e3-854b-04097f537694" Apr 20 13:30:54.062526 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:54.062508 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:30:54.062626 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:54.062609 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5pkrd" podUID="02041a2f-e9fd-4902-a9a4-47e4cd2889e4" Apr 20 13:30:56.062265 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:56.062223 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:30:56.062697 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:56.062229 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:30:56.062697 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:56.062373 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6gsn" podUID="1d19bff6-e6ed-46e3-854b-04097f537694" Apr 20 13:30:56.062697 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:56.062444 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5pkrd" podUID="02041a2f-e9fd-4902-a9a4-47e4cd2889e4" Apr 20 13:30:58.061734 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:58.061699 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:30:58.062301 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:30:58.061699 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6gsn"
Apr 20 13:30:58.062301 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:58.061829 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5pkrd" podUID="02041a2f-e9fd-4902-a9a4-47e4cd2889e4"
Apr 20 13:30:58.062301 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:30:58.061918 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6gsn" podUID="1d19bff6-e6ed-46e3-854b-04097f537694"
Apr 20 13:31:00.061833 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:00.061798 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6gsn"
Apr 20 13:31:00.062212 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:00.061924 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6gsn" podUID="1d19bff6-e6ed-46e3-854b-04097f537694"
Apr 20 13:31:00.062212 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:00.061957 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5pkrd"
Apr 20 13:31:00.062212 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:00.062038 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5pkrd" podUID="02041a2f-e9fd-4902-a9a4-47e4cd2889e4"
Apr 20 13:31:01.689545 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:01.689272 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs\") pod \"network-metrics-daemon-5pkrd\" (UID: \"02041a2f-e9fd-4902-a9a4-47e4cd2889e4\") " pod="openshift-multus/network-metrics-daemon-5pkrd"
Apr 20 13:31:01.690025 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:01.689448 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 13:31:01.690025 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:01.689694 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs podName:02041a2f-e9fd-4902-a9a4-47e4cd2889e4 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:17.689675582 +0000 UTC m=+33.253366836 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs") pod "network-metrics-daemon-5pkrd" (UID: "02041a2f-e9fd-4902-a9a4-47e4cd2889e4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 13:31:01.790938 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:01.790897 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2k6g\" (UniqueName: \"kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g\") pod \"network-check-target-x6gsn\" (UID: \"1d19bff6-e6ed-46e3-854b-04097f537694\") " pod="openshift-network-diagnostics/network-check-target-x6gsn"
Apr 20 13:31:01.791138 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:01.791068 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 13:31:01.791138 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:01.791086 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 13:31:01.791138 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:01.791096 2563 projected.go:194] Error preparing data for projected volume kube-api-access-m2k6g for pod openshift-network-diagnostics/network-check-target-x6gsn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 13:31:01.791291 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:01.791147 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g podName:1d19bff6-e6ed-46e3-854b-04097f537694 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:17.791133724 +0000 UTC m=+33.354824972 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-m2k6g" (UniqueName: "kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g") pod "network-check-target-x6gsn" (UID: "1d19bff6-e6ed-46e3-854b-04097f537694") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 13:31:02.062192 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:02.062154 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5pkrd"
Apr 20 13:31:02.062373 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:02.062152 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6gsn"
Apr 20 13:31:02.062373 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:02.062281 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5pkrd" podUID="02041a2f-e9fd-4902-a9a4-47e4cd2889e4"
Apr 20 13:31:02.062373 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:02.062335 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6gsn" podUID="1d19bff6-e6ed-46e3-854b-04097f537694"
Apr 20 13:31:04.062153 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:04.062119 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6gsn"
Apr 20 13:31:04.062618 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:04.062121 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5pkrd"
Apr 20 13:31:04.062618 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:04.062224 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6gsn" podUID="1d19bff6-e6ed-46e3-854b-04097f537694"
Apr 20 13:31:04.062618 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:04.062347 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5pkrd" podUID="02041a2f-e9fd-4902-a9a4-47e4cd2889e4"
Apr 20 13:31:05.191537 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:05.190258 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-232.ec2.internal" event={"ID":"0ea90522398b66e089408acd5ec34cb0","Type":"ContainerStarted","Data":"4b6ac2e459be11901fe8df78b591410d4a031fc73fbf413b191a8884d76fae02"}
Apr 20 13:31:05.195303 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:05.195269 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" event={"ID":"f4970e67-64f3-458f-91c6-003d6ca835f9","Type":"ContainerStarted","Data":"d743b1e19830197e3c090f674edb5a9d4462b9e69c1835c4e8721e237c5ab35b"}
Apr 20 13:31:05.205892 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:05.205847 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-232.ec2.internal" podStartSLOduration=20.205829331 podStartE2EDuration="20.205829331s" podCreationTimestamp="2026-04-20 13:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 13:31:05.205575758 +0000 UTC m=+20.769267031" watchObservedRunningTime="2026-04-20 13:31:05.205829331 +0000 UTC m=+20.769520602"
Apr 20 13:31:05.210211 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:05.210117 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/ovn-acl-logging/0.log"
Apr 20 13:31:05.210742 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:05.210649 2563 generic.go:358] "Generic (PLEG): container finished" podID="69f16c24-4d9a-4565-82a7-dbe15561755e" containerID="30a002e1a677abb8b34e2d6820ff8bd32717534c317ca697cc8855abdd7a47de" exitCode=1
Apr 20 13:31:05.210808 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:05.210742 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" event={"ID":"69f16c24-4d9a-4565-82a7-dbe15561755e","Type":"ContainerStarted","Data":"b4a21dd65d1b1fb68dbde0bddc259115d9153dbbad7348104bf14689e2d3fd3d"}
Apr 20 13:31:05.210808 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:05.210767 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" event={"ID":"69f16c24-4d9a-4565-82a7-dbe15561755e","Type":"ContainerStarted","Data":"dc5770582f074d29fb0ae7b04ad0a7ffef7956ae0cfb68a5e4b0981e1ea17c73"}
Apr 20 13:31:05.210808 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:05.210782 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" event={"ID":"69f16c24-4d9a-4565-82a7-dbe15561755e","Type":"ContainerDied","Data":"30a002e1a677abb8b34e2d6820ff8bd32717534c317ca697cc8855abdd7a47de"}
Apr 20 13:31:05.210808 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:05.210799 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" event={"ID":"69f16c24-4d9a-4565-82a7-dbe15561755e","Type":"ContainerStarted","Data":"3e365b6b700e268e05e6e8a07e66657e0373356d82967ca769a82fb3636b9d93"}
event={"ID":"99cf3e2c-0587-4d53-ae7a-4dfaea501010","Type":"ContainerStarted","Data":"c20db7625ad51d27665a45af01da473ef8c4c6e502c3e7bf074600cc2d18956b"} Apr 20 13:31:05.220214 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:05.220089 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bvb2g" podStartSLOduration=2.005211102 podStartE2EDuration="20.220027095s" podCreationTimestamp="2026-04-20 13:30:45 +0000 UTC" firstStartedPulling="2026-04-20 13:30:46.326737618 +0000 UTC m=+1.890428879" lastFinishedPulling="2026-04-20 13:31:04.541553609 +0000 UTC m=+20.105244872" observedRunningTime="2026-04-20 13:31:05.219421713 +0000 UTC m=+20.783112982" watchObservedRunningTime="2026-04-20 13:31:05.220027095 +0000 UTC m=+20.783718368" Apr 20 13:31:05.237950 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:05.237906 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bg8cz" podStartSLOduration=1.960192477 podStartE2EDuration="20.237893258s" podCreationTimestamp="2026-04-20 13:30:45 +0000 UTC" firstStartedPulling="2026-04-20 13:30:46.305422966 +0000 UTC m=+1.869114220" lastFinishedPulling="2026-04-20 13:31:04.583123739 +0000 UTC m=+20.146815001" observedRunningTime="2026-04-20 13:31:05.237427061 +0000 UTC m=+20.801118331" watchObservedRunningTime="2026-04-20 13:31:05.237893258 +0000 UTC m=+20.801584529" Apr 20 13:31:06.061744 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:06.061513 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:31:06.061900 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:06.061513 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:31:06.061900 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:06.061871 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6gsn" podUID="1d19bff6-e6ed-46e3-854b-04097f537694" Apr 20 13:31:06.062069 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:06.062004 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5pkrd" podUID="02041a2f-e9fd-4902-a9a4-47e4cd2889e4" Apr 20 13:31:06.220142 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:06.220105 2563 generic.go:358] "Generic (PLEG): container finished" podID="f22ff795-52da-4095-9d35-f9d44f2b8239" containerID="fbb6ca1f6b9bef8d13e8a2b01fe1dbb33527bfea844d4bb980ffd53a38056388" exitCode=0 Apr 20 13:31:06.220577 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:06.220209 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xwkrd" event={"ID":"f22ff795-52da-4095-9d35-f9d44f2b8239","Type":"ContainerDied","Data":"fbb6ca1f6b9bef8d13e8a2b01fe1dbb33527bfea844d4bb980ffd53a38056388"} Apr 20 13:31:06.223200 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:06.223184 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/ovn-acl-logging/0.log" Apr 20 13:31:06.223578 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:06.223552 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" event={"ID":"69f16c24-4d9a-4565-82a7-dbe15561755e","Type":"ContainerStarted","Data":"3adec4627f57c7e2681667eb3e65cead6a6f1c0df6b16ca7d0fa46191672ae9f"} Apr 20 13:31:06.223661 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:06.223591 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" event={"ID":"69f16c24-4d9a-4565-82a7-dbe15561755e","Type":"ContainerStarted","Data":"774db94bcf954b8363a4bba7b502b9feaea940f0daa0fd400a2297c3357bca37"} Apr 20 13:31:06.224976 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:06.224940 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rxqc7" event={"ID":"8ca98088-8b65-4efe-ad4e-3df5a8fe02b5","Type":"ContainerStarted","Data":"49f6b0f53d0a00fa444c6ac013690feac43e01fa911617285bbfce584e3e6427"} Apr 20 13:31:06.226293 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:06.226273 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-l7wx7" event={"ID":"22136a88-f60c-4f20-8b96-f6af8da37f19","Type":"ContainerStarted","Data":"1da1061441f3c9e4ae8edaacb569a990cff3eb159601c8fa2397e4387cc75ade"} Apr 20 13:31:06.227757 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:06.227734 2563 generic.go:358] "Generic (PLEG): container finished" podID="410d242d04c4d94f2a1060285211a1a3" containerID="b1395dad3100cc3ee562bde8b5df1ecfea2de60d2901a913b9b54b3282b5a4fd" exitCode=0 Apr 20 13:31:06.227856 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:06.227805 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-232.ec2.internal" event={"ID":"410d242d04c4d94f2a1060285211a1a3","Type":"ContainerDied","Data":"b1395dad3100cc3ee562bde8b5df1ecfea2de60d2901a913b9b54b3282b5a4fd"} Apr 20 13:31:06.229299 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:06.229274 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6" event={"ID":"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1","Type":"ContainerStarted","Data":"13bcf3ae00c20276263504732103ae9b088cad289047a40a0545bcc7f402d6f9"} Apr 20 13:31:06.230640 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:06.230612 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-wdb75" 
event={"ID":"4684c960-a06b-424c-815f-60be0a4e478f","Type":"ContainerStarted","Data":"7a8756a2b380089df549745e6a48eaf72d6401bdd39081f8670bfb2c0b9eb5df"} Apr 20 13:31:06.231999 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:06.231970 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hw898" event={"ID":"b1a9fc24-9a0e-4d24-aa45-ec1711e1399a","Type":"ContainerStarted","Data":"7ce363674a3bd5db58e865715a5985b801f649afa372875d543b66878967c5cb"} Apr 20 13:31:06.257885 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:06.257835 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hw898" podStartSLOduration=3.006883096 podStartE2EDuration="21.257815616s" podCreationTimestamp="2026-04-20 13:30:45 +0000 UTC" firstStartedPulling="2026-04-20 13:30:46.301502024 +0000 UTC m=+1.865193273" lastFinishedPulling="2026-04-20 13:31:04.55243453 +0000 UTC m=+20.116125793" observedRunningTime="2026-04-20 13:31:06.257279573 +0000 UTC m=+21.820970843" watchObservedRunningTime="2026-04-20 13:31:06.257815616 +0000 UTC m=+21.821506888" Apr 20 13:31:06.283733 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:06.283689 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rxqc7" podStartSLOduration=3.025206262 podStartE2EDuration="21.283674099s" podCreationTimestamp="2026-04-20 13:30:45 +0000 UTC" firstStartedPulling="2026-04-20 13:30:46.293960939 +0000 UTC m=+1.857652192" lastFinishedPulling="2026-04-20 13:31:04.552428779 +0000 UTC m=+20.116120029" observedRunningTime="2026-04-20 13:31:06.283344437 +0000 UTC m=+21.847035707" watchObservedRunningTime="2026-04-20 13:31:06.283674099 +0000 UTC m=+21.847365369" Apr 20 13:31:06.296951 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:06.296908 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-wdb75" podStartSLOduration=3.066435339 podStartE2EDuration="21.296894854s" podCreationTimestamp="2026-04-20 13:30:45 +0000 UTC" firstStartedPulling="2026-04-20 13:30:46.311110153 +0000 UTC m=+1.874801412" lastFinishedPulling="2026-04-20 13:31:04.541569677 +0000 UTC m=+20.105260927" observedRunningTime="2026-04-20 13:31:06.296647536 +0000 UTC m=+21.860338810" watchObservedRunningTime="2026-04-20 13:31:06.296894854 +0000 UTC m=+21.860586124" Apr 20 13:31:06.310843 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:06.310623 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-l7wx7" podStartSLOduration=3.04561859 podStartE2EDuration="21.31059587s" podCreationTimestamp="2026-04-20 13:30:45 +0000 UTC" firstStartedPulling="2026-04-20 13:30:46.28751055 +0000 UTC m=+1.851201802" lastFinishedPulling="2026-04-20 13:31:04.552487834 +0000 UTC m=+20.116179082" observedRunningTime="2026-04-20 13:31:06.309877114 +0000 UTC m=+21.873568384" watchObservedRunningTime="2026-04-20 13:31:06.31059587 +0000 UTC m=+21.874287142" Apr 20 13:31:06.475822 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:06.475797 2563 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 13:31:07.020754 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:07.020651 2563 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T13:31:06.475818232Z","UUID":"ca488f2f-da09-41f7-8aa0-4737d47cc094","Handler":null,"Name":"","Endpoint":""} Apr 20 13:31:07.023858 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:07.023827 2563 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 13:31:07.023858 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:07.023858 2563 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 13:31:07.235249 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:07.235220 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-232.ec2.internal" event={"ID":"410d242d04c4d94f2a1060285211a1a3","Type":"ContainerStarted","Data":"edb197d05ce52e8ab5be6f3eeb849a349dc5cd245696fa6b0689a06b58af8472"} Apr 20 13:31:07.237142 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:07.237111 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6" event={"ID":"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1","Type":"ContainerStarted","Data":"916f87416af5858cf87af6ab5862ad5f6e25ec085da5dc673f4050ffea5d23ba"} Apr 20 13:31:07.546853 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:07.546809 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-l7wx7" Apr 20 13:31:07.547492 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:07.547469 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-l7wx7" Apr 20 13:31:07.571643 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:07.571596 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-232.ec2.internal" podStartSLOduration=22.571579183 podStartE2EDuration="22.571579183s" podCreationTimestamp="2026-04-20 13:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 13:31:07.263657605 +0000 UTC m=+22.827348871" watchObservedRunningTime="2026-04-20 13:31:07.571579183 +0000 UTC m=+23.135270452" Apr 20 13:31:08.062185 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:08.062149 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:31:08.062380 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:08.062149 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:31:08.062380 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:08.062283 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5pkrd" podUID="02041a2f-e9fd-4902-a9a4-47e4cd2889e4" Apr 20 13:31:08.062483 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:08.062370 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6gsn" podUID="1d19bff6-e6ed-46e3-854b-04097f537694" Apr 20 13:31:08.242860 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:08.242827 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/ovn-acl-logging/0.log" Apr 20 13:31:08.243353 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:08.243332 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" event={"ID":"69f16c24-4d9a-4565-82a7-dbe15561755e","Type":"ContainerStarted","Data":"9c170b2202b1ed2bdab619db7c28177c9f1353f725ca0fd11e94ffd10835e3cf"} Apr 20 13:31:08.245473 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:08.245446 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6" event={"ID":"07b692b0-dcb9-4b70-b09f-4857c9ea8dc1","Type":"ContainerStarted","Data":"440c6d9ec9b3cd013f894de46f6b86affba607713b4b9f95f04075609c06df6f"} Apr 20 13:31:09.247692 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:09.247483 2563 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 13:31:10.062231 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:10.062190 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:31:10.062417 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:10.062202 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:31:10.062417 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:10.062309 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6gsn" podUID="1d19bff6-e6ed-46e3-854b-04097f537694" Apr 20 13:31:10.062417 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:10.062384 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5pkrd" podUID="02041a2f-e9fd-4902-a9a4-47e4cd2889e4" Apr 20 13:31:10.253249 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:10.253122 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/ovn-acl-logging/0.log" Apr 20 13:31:10.254670 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:10.253556 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" event={"ID":"69f16c24-4d9a-4565-82a7-dbe15561755e","Type":"ContainerStarted","Data":"82023af3d49c26b2a2b35eed51c80be36b72e9db217dcae10b452c13a4ae26ae"} Apr 20 13:31:10.254670 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:10.254007 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:31:10.254670 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:10.254252 2563 scope.go:117] "RemoveContainer" containerID="30a002e1a677abb8b34e2d6820ff8bd32717534c317ca697cc8855abdd7a47de" Apr 20 13:31:10.276282 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:10.276137 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:31:10.290031 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:10.289989 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wrhs6" podStartSLOduration=4.339708829 podStartE2EDuration="25.289975272s" podCreationTimestamp="2026-04-20 13:30:45 +0000 UTC" firstStartedPulling="2026-04-20 13:30:46.3239364 +0000 UTC m=+1.887627662" lastFinishedPulling="2026-04-20 13:31:07.274202855 +0000 UTC m=+22.837894105" observedRunningTime="2026-04-20 13:31:08.265821022 +0000 UTC m=+23.829512294" watchObservedRunningTime="2026-04-20 13:31:10.289975272 +0000 UTC m=+25.853666572" Apr 20 13:31:11.257327 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:11.257294 2563 generic.go:358] "Generic (PLEG): container finished" podID="f22ff795-52da-4095-9d35-f9d44f2b8239" containerID="c52f1402c62537e0e1889f336fe48f946aebe17e2f80a273bcc570e5a121ca8b" exitCode=0 Apr 20 13:31:11.257754 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:11.257366 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xwkrd" event={"ID":"f22ff795-52da-4095-9d35-f9d44f2b8239","Type":"ContainerDied","Data":"c52f1402c62537e0e1889f336fe48f946aebe17e2f80a273bcc570e5a121ca8b"} Apr 20 13:31:11.260917 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:11.260896 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/ovn-acl-logging/0.log" Apr 20 13:31:11.261266 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:11.261237 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" event={"ID":"69f16c24-4d9a-4565-82a7-dbe15561755e","Type":"ContainerStarted","Data":"5be78a29625ba4a2005ea390fe4c0bf87bc9a0db19ed59d66e20b01aad604d77"} Apr 20 13:31:11.261503 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:11.261483 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:31:11.261604 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:11.261510 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:31:11.276601 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:11.276577 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:31:12.061853 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:12.061826 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:31:12.061964 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:12.061828 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:31:12.062027 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:12.061959 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5pkrd" podUID="02041a2f-e9fd-4902-a9a4-47e4cd2889e4" Apr 20 13:31:12.062027 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:12.062002 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6gsn" podUID="1d19bff6-e6ed-46e3-854b-04097f537694" Apr 20 13:31:12.109830 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:12.109781 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" podStartSLOduration=8.524545967 podStartE2EDuration="27.109763312s" podCreationTimestamp="2026-04-20 13:30:45 +0000 UTC" firstStartedPulling="2026-04-20 13:30:46.3138654 +0000 UTC m=+1.877556651" lastFinishedPulling="2026-04-20 13:31:04.899082729 +0000 UTC m=+20.462773996" observedRunningTime="2026-04-20 13:31:11.307242669 +0000 UTC m=+26.870933939" watchObservedRunningTime="2026-04-20 13:31:12.109763312 +0000 UTC m=+27.673454582" Apr 20 13:31:12.110019 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:12.109993 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5pkrd"] Apr 20 13:31:12.110572 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:12.110548 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-x6gsn"] Apr 20 13:31:12.265292 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:12.265203 2563 generic.go:358] "Generic (PLEG): container finished" podID="f22ff795-52da-4095-9d35-f9d44f2b8239" containerID="01e74e804bdda1080b2aa8fdf768e8f164f7be98541fc70fef06c08d1b64fc14" exitCode=0 Apr 20 13:31:12.265292 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:12.265281 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:31:12.265736 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:12.265324 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xwkrd" event={"ID":"f22ff795-52da-4095-9d35-f9d44f2b8239","Type":"ContainerDied","Data":"01e74e804bdda1080b2aa8fdf768e8f164f7be98541fc70fef06c08d1b64fc14"} Apr 20 13:31:12.265736 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:12.265367 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6gsn" podUID="1d19bff6-e6ed-46e3-854b-04097f537694" Apr 20 13:31:12.265736 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:12.265515 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:31:12.265736 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:12.265638 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5pkrd" podUID="02041a2f-e9fd-4902-a9a4-47e4cd2889e4" Apr 20 13:31:13.269222 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:13.268966 2563 generic.go:358] "Generic (PLEG): container finished" podID="f22ff795-52da-4095-9d35-f9d44f2b8239" containerID="81e4342d47ff6d5f1be8ca92db966fef97ea128e3e2654d80fcf200ff6d77507" exitCode=0 Apr 20 13:31:13.269222 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:13.269076 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xwkrd" event={"ID":"f22ff795-52da-4095-9d35-f9d44f2b8239","Type":"ContainerDied","Data":"81e4342d47ff6d5f1be8ca92db966fef97ea128e3e2654d80fcf200ff6d77507"} Apr 20 13:31:14.061766 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:14.061736 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:31:14.061941 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:14.061906 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-x6gsn" podUID="1d19bff6-e6ed-46e3-854b-04097f537694" Apr 20 13:31:14.062112 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:14.062090 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:31:14.062208 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:14.062187 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-5pkrd" podUID="02041a2f-e9fd-4902-a9a4-47e4cd2889e4" Apr 20 13:31:14.468587 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:14.468485 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-l7wx7" Apr 20 13:31:14.469029 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:14.468680 2563 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 13:31:14.469365 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:14.469342 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-l7wx7" Apr 20 13:31:15.659967 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:15.659929 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-ksm7s"] Apr 20 13:31:15.683172 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:15.683142 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ksm7s"] Apr 20 13:31:15.683336 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:15.683315 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ksm7s" Apr 20 13:31:15.683522 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:15.683412 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ksm7s" podUID="68a134ea-4533-4317-bf3f-b4e22e808c81" Apr 20 13:31:15.789577 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:15.789547 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/68a134ea-4533-4317-bf3f-b4e22e808c81-dbus\") pod \"global-pull-secret-syncer-ksm7s\" (UID: \"68a134ea-4533-4317-bf3f-b4e22e808c81\") " pod="kube-system/global-pull-secret-syncer-ksm7s" Apr 20 13:31:15.789766 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:15.789615 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/68a134ea-4533-4317-bf3f-b4e22e808c81-original-pull-secret\") pod \"global-pull-secret-syncer-ksm7s\" (UID: \"68a134ea-4533-4317-bf3f-b4e22e808c81\") " pod="kube-system/global-pull-secret-syncer-ksm7s" Apr 20 13:31:15.789766 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:15.789695 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/68a134ea-4533-4317-bf3f-b4e22e808c81-kubelet-config\") pod \"global-pull-secret-syncer-ksm7s\" (UID: \"68a134ea-4533-4317-bf3f-b4e22e808c81\") " pod="kube-system/global-pull-secret-syncer-ksm7s" Apr 20 13:31:15.890703 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:15.890669 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/68a134ea-4533-4317-bf3f-b4e22e808c81-original-pull-secret\") pod \"global-pull-secret-syncer-ksm7s\" (UID: \"68a134ea-4533-4317-bf3f-b4e22e808c81\") " pod="kube-system/global-pull-secret-syncer-ksm7s" Apr 20 13:31:15.890848 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:15.890726 2563 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/68a134ea-4533-4317-bf3f-b4e22e808c81-kubelet-config\") pod \"global-pull-secret-syncer-ksm7s\" (UID: \"68a134ea-4533-4317-bf3f-b4e22e808c81\") " pod="kube-system/global-pull-secret-syncer-ksm7s" Apr 20 13:31:15.890848 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:15.890782 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/68a134ea-4533-4317-bf3f-b4e22e808c81-dbus\") pod \"global-pull-secret-syncer-ksm7s\" (UID: \"68a134ea-4533-4317-bf3f-b4e22e808c81\") " pod="kube-system/global-pull-secret-syncer-ksm7s" Apr 20 13:31:15.890848 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:15.890820 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 13:31:15.890998 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:15.890877 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/68a134ea-4533-4317-bf3f-b4e22e808c81-kubelet-config\") pod \"global-pull-secret-syncer-ksm7s\" (UID: \"68a134ea-4533-4317-bf3f-b4e22e808c81\") " pod="kube-system/global-pull-secret-syncer-ksm7s" Apr 20 13:31:15.890998 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:15.890884 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68a134ea-4533-4317-bf3f-b4e22e808c81-original-pull-secret podName:68a134ea-4533-4317-bf3f-b4e22e808c81 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:16.390870137 +0000 UTC m=+31.954561390 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/68a134ea-4533-4317-bf3f-b4e22e808c81-original-pull-secret") pod "global-pull-secret-syncer-ksm7s" (UID: "68a134ea-4533-4317-bf3f-b4e22e808c81") : object "kube-system"/"original-pull-secret" not registered Apr 20 13:31:15.891104 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:15.891008 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/68a134ea-4533-4317-bf3f-b4e22e808c81-dbus\") pod \"global-pull-secret-syncer-ksm7s\" (UID: \"68a134ea-4533-4317-bf3f-b4e22e808c81\") " pod="kube-system/global-pull-secret-syncer-ksm7s" Apr 20 13:31:16.062028 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:16.061996 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:31:16.062028 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:16.062035 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:31:16.062287 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:16.062131 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-x6gsn" podUID="1d19bff6-e6ed-46e3-854b-04097f537694" Apr 20 13:31:16.062287 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:16.062274 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-5pkrd" podUID="02041a2f-e9fd-4902-a9a4-47e4cd2889e4" Apr 20 13:31:16.276268 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:16.276239 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ksm7s" Apr 20 13:31:16.276423 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:16.276374 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ksm7s" podUID="68a134ea-4533-4317-bf3f-b4e22e808c81" Apr 20 13:31:16.394871 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:16.394794 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/68a134ea-4533-4317-bf3f-b4e22e808c81-original-pull-secret\") pod \"global-pull-secret-syncer-ksm7s\" (UID: \"68a134ea-4533-4317-bf3f-b4e22e808c81\") " pod="kube-system/global-pull-secret-syncer-ksm7s" Apr 20 13:31:16.395012 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:16.394950 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 13:31:16.395074 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:16.395025 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68a134ea-4533-4317-bf3f-b4e22e808c81-original-pull-secret podName:68a134ea-4533-4317-bf3f-b4e22e808c81 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:17.395008142 +0000 UTC m=+32.958699392 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/68a134ea-4533-4317-bf3f-b4e22e808c81-original-pull-secret") pod "global-pull-secret-syncer-ksm7s" (UID: "68a134ea-4533-4317-bf3f-b4e22e808c81") : object "kube-system"/"original-pull-secret" not registered Apr 20 13:31:17.403061 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:17.403021 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/68a134ea-4533-4317-bf3f-b4e22e808c81-original-pull-secret\") pod \"global-pull-secret-syncer-ksm7s\" (UID: \"68a134ea-4533-4317-bf3f-b4e22e808c81\") " pod="kube-system/global-pull-secret-syncer-ksm7s" Apr 20 13:31:17.403615 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:17.403181 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 13:31:17.403615 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:17.403262 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68a134ea-4533-4317-bf3f-b4e22e808c81-original-pull-secret podName:68a134ea-4533-4317-bf3f-b4e22e808c81 nodeName:}" failed. 
No retries permitted until 2026-04-20 13:31:19.403241962 +0000 UTC m=+34.966933233 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/68a134ea-4533-4317-bf3f-b4e22e808c81-original-pull-secret") pod "global-pull-secret-syncer-ksm7s" (UID: "68a134ea-4533-4317-bf3f-b4e22e808c81") : object "kube-system"/"original-pull-secret" not registered Apr 20 13:31:17.705952 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:17.705860 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs\") pod \"network-metrics-daemon-5pkrd\" (UID: \"02041a2f-e9fd-4902-a9a4-47e4cd2889e4\") " pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:31:17.706127 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:17.706035 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:31:17.706127 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:17.706122 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs podName:02041a2f-e9fd-4902-a9a4-47e4cd2889e4 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:49.706104674 +0000 UTC m=+65.269795932 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs") pod "network-metrics-daemon-5pkrd" (UID: "02041a2f-e9fd-4902-a9a4-47e4cd2889e4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 13:31:17.772869 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:17.772830 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-232.ec2.internal" event="NodeReady" Apr 20 13:31:17.773076 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:17.772993 2563 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 13:31:17.807178 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:17.807144 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2k6g\" (UniqueName: \"kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g\") pod \"network-check-target-x6gsn\" (UID: \"1d19bff6-e6ed-46e3-854b-04097f537694\") " pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:31:17.807347 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:17.807306 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 13:31:17.807347 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:17.807330 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 13:31:17.807347 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:17.807341 2563 projected.go:194] Error preparing data for projected volume kube-api-access-m2k6g for pod openshift-network-diagnostics/network-check-target-x6gsn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 13:31:17.807472 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:17.807390 2563 
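Note: the durationBeforeRetry values above (500ms, 1s, 2s for original-pull-secret; 16s and 32s for the longer-failing volumes) are a per-operation doubling backoff. A minimal sketch of that schedule; the constants mirror kubelet's goroutinemap/exponentialbackoff package as I understand it (initial 500ms, factor 2, capped at 2m2s), so treat the cap as an assumption rather than a verified value:

    package main

    import (
    	"fmt"
    	"time"
    )

    const (
    	initialDurationBeforeRetry = 500 * time.Millisecond
    	maxDurationBeforeRetry     = 2*time.Minute + 2*time.Second // assumed cap
    )

    // nextBackoff doubles the previous delay, starting at 500ms and saturating
    // at the cap, matching the cadence visible in the nestedpendingoperations lines.
    func nextBackoff(current time.Duration) time.Duration {
    	if current == 0 {
    		return initialDurationBeforeRetry
    	}
    	if next := 2 * current; next < maxDurationBeforeRetry {
    		return next
    	}
    	return maxDurationBeforeRetry
    }

    func main() {
    	// Prints: 500ms 1s 2s 4s 8s 16s 32s 1m4s 2m2s 2m2s
    	d := time.Duration(0)
    	for i := 0; i < 10; i++ {
    		d = nextBackoff(d)
    		fmt.Print(d, " ")
    	}
    	fmt.Println()
    }

The backoff explains why metrics-certs and kube-api-access-m2k6g, which have been failing since 13:31:01, are now parked for 32s while the freshly-created original-pull-secret volume retries every few hundred milliseconds.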
Apr 20 13:31:17.807472 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:17.807390 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g podName:1d19bff6-e6ed-46e3-854b-04097f537694 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:49.807377314 +0000 UTC m=+65.371068565 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-m2k6g" (UniqueName: "kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g") pod "network-check-target-x6gsn" (UID: "1d19bff6-e6ed-46e3-854b-04097f537694") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 13:31:17.865800 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:17.865767 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-lw64p"]
Apr 20 13:31:17.891243 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:17.891215 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jq28g"]
Apr 20 13:31:17.891423 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:17.891403 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lw64p"
Apr 20 13:31:17.895078 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:17.895034 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 13:31:17.895613 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:17.895592 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 13:31:17.895724 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:17.895627 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 13:31:17.895724 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:17.895592 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mqqqc\""
Apr 20 13:31:17.908110 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:17.908087 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jq28g"]
Apr 20 13:31:17.908225 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:17.908115 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lw64p"]
Apr 20 13:31:17.908225 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:17.908214 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jq28g"
Apr 20 13:31:17.910894 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:17.910873 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 20 13:31:17.911037 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:17.910948 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 20 13:31:17.911037 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:17.910973 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-c8wr5\""
Apr 20 13:31:18.008353 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.008270 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fd3c7eb-0c40-4911-a067-10cda31de0d7-config-volume\") pod \"dns-default-jq28g\" (UID: \"5fd3c7eb-0c40-4911-a067-10cda31de0d7\") " pod="openshift-dns/dns-default-jq28g"
Apr 20 13:31:18.008353 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.008330 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j8ph\" (UniqueName: \"kubernetes.io/projected/5fd3c7eb-0c40-4911-a067-10cda31de0d7-kube-api-access-6j8ph\") pod \"dns-default-jq28g\" (UID: \"5fd3c7eb-0c40-4911-a067-10cda31de0d7\") " pod="openshift-dns/dns-default-jq28g"
Apr 20 13:31:18.008353 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.008359 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c0036f9-a08e-4eac-8ad7-301fe4765604-cert\") pod \"ingress-canary-lw64p\" (UID: \"8c0036f9-a08e-4eac-8ad7-301fe4765604\") " pod="openshift-ingress-canary/ingress-canary-lw64p"
Apr 20 13:31:18.008692 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.008387 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fd3c7eb-0c40-4911-a067-10cda31de0d7-metrics-tls\") pod \"dns-default-jq28g\" (UID: \"5fd3c7eb-0c40-4911-a067-10cda31de0d7\") " pod="openshift-dns/dns-default-jq28g"
Apr 20 13:31:18.008692 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.008511 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc45d\" (UniqueName: \"kubernetes.io/projected/8c0036f9-a08e-4eac-8ad7-301fe4765604-kube-api-access-wc45d\") pod \"ingress-canary-lw64p\" (UID: \"8c0036f9-a08e-4eac-8ad7-301fe4765604\") " pod="openshift-ingress-canary/ingress-canary-lw64p"
Apr 20 13:31:18.008692 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.008578 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5fd3c7eb-0c40-4911-a067-10cda31de0d7-tmp-dir\") pod \"dns-default-jq28g\" (UID: \"5fd3c7eb-0c40-4911-a067-10cda31de0d7\") " pod="openshift-dns/dns-default-jq28g"
Apr 20 13:31:18.062213 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.062177 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ksm7s"
Apr 20 13:31:18.062213 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.062202 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5pkrd"
Apr 20 13:31:18.062437 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.062202 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6gsn"
Apr 20 13:31:18.066202 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.066179 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 13:31:18.066351 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.066249 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wf844\""
Apr 20 13:31:18.066351 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.066182 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 13:31:18.066469 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.066182 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 13:31:18.066734 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.066717 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 13:31:18.066870 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.066852 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gqjcl\""
Apr 20 13:31:18.109922 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.109873 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fd3c7eb-0c40-4911-a067-10cda31de0d7-metrics-tls\") pod \"dns-default-jq28g\" (UID: \"5fd3c7eb-0c40-4911-a067-10cda31de0d7\") " pod="openshift-dns/dns-default-jq28g"
Apr 20 13:31:18.110119 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.109952 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wc45d\" (UniqueName: \"kubernetes.io/projected/8c0036f9-a08e-4eac-8ad7-301fe4765604-kube-api-access-wc45d\") pod \"ingress-canary-lw64p\" (UID: \"8c0036f9-a08e-4eac-8ad7-301fe4765604\") " pod="openshift-ingress-canary/ingress-canary-lw64p"
Apr 20 13:31:18.110119 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.109996 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5fd3c7eb-0c40-4911-a067-10cda31de0d7-tmp-dir\") pod \"dns-default-jq28g\" (UID: \"5fd3c7eb-0c40-4911-a067-10cda31de0d7\") " pod="openshift-dns/dns-default-jq28g"
Apr 20 13:31:18.110119 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.110039 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fd3c7eb-0c40-4911-a067-10cda31de0d7-config-volume\") pod \"dns-default-jq28g\" (UID: \"5fd3c7eb-0c40-4911-a067-10cda31de0d7\") " pod="openshift-dns/dns-default-jq28g"
Apr 20 13:31:18.110119 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:18.110088 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 13:31:18.110119 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.110098 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6j8ph\" (UniqueName: \"kubernetes.io/projected/5fd3c7eb-0c40-4911-a067-10cda31de0d7-kube-api-access-6j8ph\") pod \"dns-default-jq28g\" (UID: \"5fd3c7eb-0c40-4911-a067-10cda31de0d7\") " pod="openshift-dns/dns-default-jq28g"
Apr 20 13:31:18.110119 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.110115 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c0036f9-a08e-4eac-8ad7-301fe4765604-cert\") pod \"ingress-canary-lw64p\" (UID: \"8c0036f9-a08e-4eac-8ad7-301fe4765604\") " pod="openshift-ingress-canary/ingress-canary-lw64p"
Apr 20 13:31:18.110344 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:18.110154 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fd3c7eb-0c40-4911-a067-10cda31de0d7-metrics-tls podName:5fd3c7eb-0c40-4911-a067-10cda31de0d7 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:18.610131328 +0000 UTC m=+34.173822581 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5fd3c7eb-0c40-4911-a067-10cda31de0d7-metrics-tls") pod "dns-default-jq28g" (UID: "5fd3c7eb-0c40-4911-a067-10cda31de0d7") : secret "dns-default-metrics-tls" not found
Apr 20 13:31:18.110344 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:18.110189 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c0036f9-a08e-4eac-8ad7-301fe4765604-cert") pod "ingress-canary-lw64p" (UID: "8c0036f9-a08e-4eac-8ad7-301fe4765604") : secret "canary-serving-cert" not found Apr 20 13:31:18.110503 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.110494 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5fd3c7eb-0c40-4911-a067-10cda31de0d7-tmp-dir\") pod \"dns-default-jq28g\" (UID: \"5fd3c7eb-0c40-4911-a067-10cda31de0d7\") " pod="openshift-dns/dns-default-jq28g" Apr 20 13:31:18.110787 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.110761 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5fd3c7eb-0c40-4911-a067-10cda31de0d7-config-volume\") pod \"dns-default-jq28g\" (UID: \"5fd3c7eb-0c40-4911-a067-10cda31de0d7\") " pod="openshift-dns/dns-default-jq28g" Apr 20 13:31:18.131725 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.131687 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc45d\" (UniqueName: \"kubernetes.io/projected/8c0036f9-a08e-4eac-8ad7-301fe4765604-kube-api-access-wc45d\") pod \"ingress-canary-lw64p\" (UID: \"8c0036f9-a08e-4eac-8ad7-301fe4765604\") " pod="openshift-ingress-canary/ingress-canary-lw64p" Apr 20 13:31:18.132198 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.132176 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j8ph\" (UniqueName: \"kubernetes.io/projected/5fd3c7eb-0c40-4911-a067-10cda31de0d7-kube-api-access-6j8ph\") pod \"dns-default-jq28g\" (UID: \"5fd3c7eb-0c40-4911-a067-10cda31de0d7\") " pod="openshift-dns/dns-default-jq28g" Apr 20 13:31:18.614841 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.614796 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c0036f9-a08e-4eac-8ad7-301fe4765604-cert\") pod \"ingress-canary-lw64p\" (UID: \"8c0036f9-a08e-4eac-8ad7-301fe4765604\") " pod="openshift-ingress-canary/ingress-canary-lw64p" Apr 20 13:31:18.615435 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:18.614859 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fd3c7eb-0c40-4911-a067-10cda31de0d7-metrics-tls\") pod \"dns-default-jq28g\" (UID: \"5fd3c7eb-0c40-4911-a067-10cda31de0d7\") " pod="openshift-dns/dns-default-jq28g" Apr 20 13:31:18.615435 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:18.614964 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 13:31:18.615435 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:18.614991 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 13:31:18.615435 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:18.615064 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c0036f9-a08e-4eac-8ad7-301fe4765604-cert podName:8c0036f9-a08e-4eac-8ad7-301fe4765604 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:19.615026757 +0000 UTC m=+35.178718005 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c0036f9-a08e-4eac-8ad7-301fe4765604-cert") pod "ingress-canary-lw64p" (UID: "8c0036f9-a08e-4eac-8ad7-301fe4765604") : secret "canary-serving-cert" not found Apr 20 13:31:18.615435 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:18.615084 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fd3c7eb-0c40-4911-a067-10cda31de0d7-metrics-tls podName:5fd3c7eb-0c40-4911-a067-10cda31de0d7 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:19.615074408 +0000 UTC m=+35.178765663 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5fd3c7eb-0c40-4911-a067-10cda31de0d7-metrics-tls") pod "dns-default-jq28g" (UID: "5fd3c7eb-0c40-4911-a067-10cda31de0d7") : secret "dns-default-metrics-tls" not found Apr 20 13:31:19.421188 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:19.421151 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/68a134ea-4533-4317-bf3f-b4e22e808c81-original-pull-secret\") pod \"global-pull-secret-syncer-ksm7s\" (UID: \"68a134ea-4533-4317-bf3f-b4e22e808c81\") " pod="kube-system/global-pull-secret-syncer-ksm7s" Apr 20 13:31:19.423524 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:19.423504 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/68a134ea-4533-4317-bf3f-b4e22e808c81-original-pull-secret\") pod \"global-pull-secret-syncer-ksm7s\" (UID: \"68a134ea-4533-4317-bf3f-b4e22e808c81\") " pod="kube-system/global-pull-secret-syncer-ksm7s" Apr 20 13:31:19.574035 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:19.573996 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ksm7s" Apr 20 13:31:19.622364 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:19.622323 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c0036f9-a08e-4eac-8ad7-301fe4765604-cert\") pod \"ingress-canary-lw64p\" (UID: \"8c0036f9-a08e-4eac-8ad7-301fe4765604\") " pod="openshift-ingress-canary/ingress-canary-lw64p" Apr 20 13:31:19.622796 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:19.622371 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fd3c7eb-0c40-4911-a067-10cda31de0d7-metrics-tls\") pod \"dns-default-jq28g\" (UID: \"5fd3c7eb-0c40-4911-a067-10cda31de0d7\") " pod="openshift-dns/dns-default-jq28g" Apr 20 13:31:19.622796 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:19.622476 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 13:31:19.622796 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:19.622540 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c0036f9-a08e-4eac-8ad7-301fe4765604-cert podName:8c0036f9-a08e-4eac-8ad7-301fe4765604 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:21.62252526 +0000 UTC m=+37.186216512 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c0036f9-a08e-4eac-8ad7-301fe4765604-cert") pod "ingress-canary-lw64p" (UID: "8c0036f9-a08e-4eac-8ad7-301fe4765604") : secret "canary-serving-cert" not found Apr 20 13:31:19.622796 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:19.622478 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 13:31:19.622796 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:19.622572 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fd3c7eb-0c40-4911-a067-10cda31de0d7-metrics-tls podName:5fd3c7eb-0c40-4911-a067-10cda31de0d7 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:21.622566123 +0000 UTC m=+37.186257371 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5fd3c7eb-0c40-4911-a067-10cda31de0d7-metrics-tls") pod "dns-default-jq28g" (UID: "5fd3c7eb-0c40-4911-a067-10cda31de0d7") : secret "dns-default-metrics-tls" not found Apr 20 13:31:19.927346 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:19.927207 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ksm7s"] Apr 20 13:31:19.930739 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:31:19.930711 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68a134ea_4533_4317_bf3f_b4e22e808c81.slice/crio-314d8bcf186aa46209a82b1420db92a4f3a613520e13b0ef744663b7292a1298 WatchSource:0}: Error finding container 314d8bcf186aa46209a82b1420db92a4f3a613520e13b0ef744663b7292a1298: Status 404 returned error can't find the container with id 314d8bcf186aa46209a82b1420db92a4f3a613520e13b0ef744663b7292a1298 Apr 20 13:31:20.288578 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:20.288542 2563 generic.go:358] "Generic (PLEG): container finished" podID="f22ff795-52da-4095-9d35-f9d44f2b8239" containerID="b5398a99338839fa3fe8f4040bfdc2a7bef0f5004cf873f8dea492e7d61ad9d5" exitCode=0 Apr 20 13:31:20.288739 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:20.288627 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xwkrd" event={"ID":"f22ff795-52da-4095-9d35-f9d44f2b8239","Type":"ContainerDied","Data":"b5398a99338839fa3fe8f4040bfdc2a7bef0f5004cf873f8dea492e7d61ad9d5"} Apr 20 13:31:20.289682 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:20.289656 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ksm7s" event={"ID":"68a134ea-4533-4317-bf3f-b4e22e808c81","Type":"ContainerStarted","Data":"314d8bcf186aa46209a82b1420db92a4f3a613520e13b0ef744663b7292a1298"} Apr 20 13:31:21.294984 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:21.294759 2563 generic.go:358] "Generic (PLEG): container finished" podID="f22ff795-52da-4095-9d35-f9d44f2b8239" containerID="69477ab38f4121014db9328eaf6a19b6005ffdbce87b40816b8a75c13734fbac" exitCode=0 Apr 20 13:31:21.295464 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:21.294841 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xwkrd" event={"ID":"f22ff795-52da-4095-9d35-f9d44f2b8239","Type":"ContainerDied","Data":"69477ab38f4121014db9328eaf6a19b6005ffdbce87b40816b8a75c13734fbac"} Apr 20 13:31:21.641538 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:21.641506 2563 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c0036f9-a08e-4eac-8ad7-301fe4765604-cert\") pod \"ingress-canary-lw64p\" (UID: \"8c0036f9-a08e-4eac-8ad7-301fe4765604\") " pod="openshift-ingress-canary/ingress-canary-lw64p" Apr 20 13:31:21.641669 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:21.641559 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fd3c7eb-0c40-4911-a067-10cda31de0d7-metrics-tls\") pod \"dns-default-jq28g\" (UID: \"5fd3c7eb-0c40-4911-a067-10cda31de0d7\") " pod="openshift-dns/dns-default-jq28g" Apr 20 13:31:21.641745 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:21.641670 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 13:31:21.641745 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:21.641689 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 13:31:21.641834 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:21.641759 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c0036f9-a08e-4eac-8ad7-301fe4765604-cert podName:8c0036f9-a08e-4eac-8ad7-301fe4765604 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:25.641734663 +0000 UTC m=+41.205425926 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c0036f9-a08e-4eac-8ad7-301fe4765604-cert") pod "ingress-canary-lw64p" (UID: "8c0036f9-a08e-4eac-8ad7-301fe4765604") : secret "canary-serving-cert" not found Apr 20 13:31:21.641834 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:21.641800 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fd3c7eb-0c40-4911-a067-10cda31de0d7-metrics-tls podName:5fd3c7eb-0c40-4911-a067-10cda31de0d7 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:25.641771152 +0000 UTC m=+41.205462401 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5fd3c7eb-0c40-4911-a067-10cda31de0d7-metrics-tls") pod "dns-default-jq28g" (UID: "5fd3c7eb-0c40-4911-a067-10cda31de0d7") : secret "dns-default-metrics-tls" not found Apr 20 13:31:22.301297 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.301260 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xwkrd" event={"ID":"f22ff795-52da-4095-9d35-f9d44f2b8239","Type":"ContainerStarted","Data":"d21852709be22d0dceb2d7170defe545e71aeda56860bc7551824a202ae758f6"} Apr 20 13:31:22.327101 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.327022 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xwkrd" podStartSLOduration=3.833406825 podStartE2EDuration="37.32700459s" podCreationTimestamp="2026-04-20 13:30:45 +0000 UTC" firstStartedPulling="2026-04-20 13:30:46.281157197 +0000 UTC m=+1.844848452" lastFinishedPulling="2026-04-20 13:31:19.774754962 +0000 UTC m=+35.338446217" observedRunningTime="2026-04-20 13:31:22.325001373 +0000 UTC m=+37.888692655" watchObservedRunningTime="2026-04-20 13:31:22.32700459 +0000 UTC m=+37.890695857" Apr 20 13:31:22.514843 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.514802 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-mtqqq"] Apr 20 13:31:22.537460 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.537434 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lrfsk"] Apr 20 13:31:22.537635 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.537599 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mtqqq" Apr 20 13:31:22.542308 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.542110 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-qwlkl\"" Apr 20 13:31:22.552689 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.552618 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ws27x"] Apr 20 13:31:22.552805 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.552776 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lrfsk" Apr 20 13:31:22.555575 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.555289 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 20 13:31:22.555575 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.555335 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 20 13:31:22.555751 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.555628 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 20 13:31:22.555751 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.555648 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 20 13:31:22.555751 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.555631 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-csttw\"" Apr 20 13:31:22.572888 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.572864 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-mtqqq"] Apr 20 13:31:22.573025 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.572906 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lrfsk"] Apr 20 13:31:22.573025 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.572919 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ws27x"] Apr 20 13:31:22.573025 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.572921 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ws27x" Apr 20 13:31:22.575826 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.575804 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 20 13:31:22.575945 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.575825 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-f2rq5\"" Apr 20 13:31:22.575945 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.575845 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 20 13:31:22.575945 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.575804 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 20 13:31:22.616752 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.616725 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7f94986b8b-ms7wn"] Apr 20 13:31:22.638324 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.638290 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-qrddw"] Apr 20 13:31:22.638491 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.638472 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn" Apr 20 13:31:22.644085 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.644020 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 13:31:22.646013 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.644610 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 13:31:22.646712 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.646694 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 13:31:22.648760 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.648738 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f159e37-18ec-4deb-a9fa-9b41fad19818-config\") pod \"service-ca-operator-d6fc45fc5-lrfsk\" (UID: \"4f159e37-18ec-4deb-a9fa-9b41fad19818\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lrfsk" Apr 20 13:31:22.648866 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.648804 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwqlb\" (UniqueName: \"kubernetes.io/projected/4f159e37-18ec-4deb-a9fa-9b41fad19818-kube-api-access-bwqlb\") pod \"service-ca-operator-d6fc45fc5-lrfsk\" (UID: \"4f159e37-18ec-4deb-a9fa-9b41fad19818\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lrfsk" Apr 20 13:31:22.648866 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.648849 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwvwj\" (UniqueName: \"kubernetes.io/projected/9cf1343c-20e8-44fe-bb3e-59e64970b967-kube-api-access-fwvwj\") pod \"network-check-source-8894fc9bd-mtqqq\" (UID: \"9cf1343c-20e8-44fe-bb3e-59e64970b967\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mtqqq" Apr 20 13:31:22.648990 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.648867 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f159e37-18ec-4deb-a9fa-9b41fad19818-serving-cert\") pod \"service-ca-operator-d6fc45fc5-lrfsk\" (UID: \"4f159e37-18ec-4deb-a9fa-9b41fad19818\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lrfsk" Apr 20 13:31:22.651620 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.651596 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xw967\"" Apr 20 13:31:22.653852 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.653832 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7f94986b8b-ms7wn"] Apr 20 13:31:22.653952 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.653860 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-qrddw"] Apr 20 13:31:22.654010 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.653981 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-qrddw" Apr 20 13:31:22.656814 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.656796 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 20 13:31:22.657168 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.657148 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 13:31:22.657824 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.657803 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 20 13:31:22.657967 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.657948 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 13:31:22.658079 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.658023 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-mtstl\"" Apr 20 13:31:22.663791 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.663772 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 13:31:22.670106 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.670087 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 20 13:31:22.731457 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.731425 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gzsss"] Apr 20 13:31:22.745459 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.745430 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gzsss" Apr 20 13:31:22.748354 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.748329 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 20 13:31:22.748779 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.748760 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 20 13:31:22.748893 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.748803 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-bwrs9\"" Apr 20 13:31:22.749245 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.749218 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwvwj\" (UniqueName: \"kubernetes.io/projected/9cf1343c-20e8-44fe-bb3e-59e64970b967-kube-api-access-fwvwj\") pod \"network-check-source-8894fc9bd-mtqqq\" (UID: \"9cf1343c-20e8-44fe-bb3e-59e64970b967\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mtqqq" Apr 20 13:31:22.749354 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.749289 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f159e37-18ec-4deb-a9fa-9b41fad19818-serving-cert\") pod \"service-ca-operator-d6fc45fc5-lrfsk\" (UID: \"4f159e37-18ec-4deb-a9fa-9b41fad19818\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lrfsk" Apr 20 13:31:22.750099 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.749386 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0308410e-09ea-4fe7-82af-d5ff8756db17-image-registry-private-configuration\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn" Apr 20 13:31:22.750200 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.750176 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q9xc\" (UniqueName: \"kubernetes.io/projected/e1a7d897-4926-44f1-affc-910b9bae479b-kube-api-access-6q9xc\") pod \"cluster-samples-operator-6dc5bdb6b4-ws27x\" (UID: \"e1a7d897-4926-44f1-affc-910b9bae479b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ws27x" Apr 20 13:31:22.750263 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.750209 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0308410e-09ea-4fe7-82af-d5ff8756db17-trusted-ca\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn" Apr 20 13:31:22.750263 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.750242 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f159e37-18ec-4deb-a9fa-9b41fad19818-config\") pod \"service-ca-operator-d6fc45fc5-lrfsk\" (UID: \"4f159e37-18ec-4deb-a9fa-9b41fad19818\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lrfsk" Apr 20 13:31:22.750367 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.750269 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f9cc52f-4998-4934-a999-16ee91bf3d4a-service-ca-bundle\") pod \"insights-operator-585dfdc468-qrddw\" (UID: \"5f9cc52f-4998-4934-a999-16ee91bf3d4a\") " pod="openshift-insights/insights-operator-585dfdc468-qrddw" Apr 20 13:31:22.750367 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.750326 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn" Apr 20 13:31:22.750367 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.750358 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/5f9cc52f-4998-4934-a999-16ee91bf3d4a-snapshots\") pod \"insights-operator-585dfdc468-qrddw\" (UID: \"5f9cc52f-4998-4934-a999-16ee91bf3d4a\") " pod="openshift-insights/insights-operator-585dfdc468-qrddw" Apr 20 13:31:22.750503 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.750396 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-certificates\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn" Apr 20 13:31:22.750503 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.750478 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bwqlb\" (UniqueName: \"kubernetes.io/projected/4f159e37-18ec-4deb-a9fa-9b41fad19818-kube-api-access-bwqlb\") pod \"service-ca-operator-d6fc45fc5-lrfsk\" (UID: \"4f159e37-18ec-4deb-a9fa-9b41fad19818\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lrfsk" Apr 20 13:31:22.750605 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.750518 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0308410e-09ea-4fe7-82af-d5ff8756db17-ca-trust-extracted\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn" Apr 20 13:31:22.750605 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.750548 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f9cc52f-4998-4934-a999-16ee91bf3d4a-tmp\") pod \"insights-operator-585dfdc468-qrddw\" (UID: \"5f9cc52f-4998-4934-a999-16ee91bf3d4a\") " pod="openshift-insights/insights-operator-585dfdc468-qrddw" Apr 20 13:31:22.750605 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.750577 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59bvx\" (UniqueName: \"kubernetes.io/projected/5f9cc52f-4998-4934-a999-16ee91bf3d4a-kube-api-access-59bvx\") pod 
\"insights-operator-585dfdc468-qrddw\" (UID: \"5f9cc52f-4998-4934-a999-16ee91bf3d4a\") " pod="openshift-insights/insights-operator-585dfdc468-qrddw" Apr 20 13:31:22.750778 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.750619 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f9cc52f-4998-4934-a999-16ee91bf3d4a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-qrddw\" (UID: \"5f9cc52f-4998-4934-a999-16ee91bf3d4a\") " pod="openshift-insights/insights-operator-585dfdc468-qrddw" Apr 20 13:31:22.750778 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.750643 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f9cc52f-4998-4934-a999-16ee91bf3d4a-serving-cert\") pod \"insights-operator-585dfdc468-qrddw\" (UID: \"5f9cc52f-4998-4934-a999-16ee91bf3d4a\") " pod="openshift-insights/insights-operator-585dfdc468-qrddw" Apr 20 13:31:22.750778 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.750670 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1a7d897-4926-44f1-affc-910b9bae479b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ws27x\" (UID: \"e1a7d897-4926-44f1-affc-910b9bae479b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ws27x" Apr 20 13:31:22.750778 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.750695 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0308410e-09ea-4fe7-82af-d5ff8756db17-installation-pull-secrets\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn" Apr 20 13:31:22.750778 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.750720 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-bound-sa-token\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn" Apr 20 13:31:22.750778 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.750759 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9s6x\" (UniqueName: \"kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-kube-api-access-w9s6x\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn" Apr 20 13:31:22.751010 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.750979 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gzsss"] Apr 20 13:31:22.751156 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.751137 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f159e37-18ec-4deb-a9fa-9b41fad19818-config\") pod \"service-ca-operator-d6fc45fc5-lrfsk\" (UID: \"4f159e37-18ec-4deb-a9fa-9b41fad19818\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lrfsk" Apr 
Apr 20 13:31:22.755081 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.754991 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f159e37-18ec-4deb-a9fa-9b41fad19818-serving-cert\") pod \"service-ca-operator-d6fc45fc5-lrfsk\" (UID: \"4f159e37-18ec-4deb-a9fa-9b41fad19818\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lrfsk"
Apr 20 13:31:22.759380 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.759354 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwvwj\" (UniqueName: \"kubernetes.io/projected/9cf1343c-20e8-44fe-bb3e-59e64970b967-kube-api-access-fwvwj\") pod \"network-check-source-8894fc9bd-mtqqq\" (UID: \"9cf1343c-20e8-44fe-bb3e-59e64970b967\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mtqqq"
Apr 20 13:31:22.763033 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.763007 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwqlb\" (UniqueName: \"kubernetes.io/projected/4f159e37-18ec-4deb-a9fa-9b41fad19818-kube-api-access-bwqlb\") pod \"service-ca-operator-d6fc45fc5-lrfsk\" (UID: \"4f159e37-18ec-4deb-a9fa-9b41fad19818\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lrfsk"
Apr 20 13:31:22.848403 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.848326 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mtqqq"
Apr 20 13:31:22.851248 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.851221 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0308410e-09ea-4fe7-82af-d5ff8756db17-ca-trust-extracted\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:22.851376 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.851253 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f9cc52f-4998-4934-a999-16ee91bf3d4a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-qrddw\" (UID: \"5f9cc52f-4998-4934-a999-16ee91bf3d4a\") " pod="openshift-insights/insights-operator-585dfdc468-qrddw"
Apr 20 13:31:22.851376 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.851310 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0308410e-09ea-4fe7-82af-d5ff8756db17-trusted-ca\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:22.851376 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.851332 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-bound-sa-token\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:22.851376 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.851348 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9s6x\" (UniqueName: \"kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-kube-api-access-w9s6x\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:22.851570 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.851431 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-59bvx\" (UniqueName: \"kubernetes.io/projected/5f9cc52f-4998-4934-a999-16ee91bf3d4a-kube-api-access-59bvx\") pod \"insights-operator-585dfdc468-qrddw\" (UID: \"5f9cc52f-4998-4934-a999-16ee91bf3d4a\") " pod="openshift-insights/insights-operator-585dfdc468-qrddw"
Apr 20 13:31:22.851570 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.851478 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f9cc52f-4998-4934-a999-16ee91bf3d4a-tmp\") pod \"insights-operator-585dfdc468-qrddw\" (UID: \"5f9cc52f-4998-4934-a999-16ee91bf3d4a\") " pod="openshift-insights/insights-operator-585dfdc468-qrddw"
Apr 20 13:31:22.851570 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.851508 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f9cc52f-4998-4934-a999-16ee91bf3d4a-serving-cert\") pod \"insights-operator-585dfdc468-qrddw\" (UID: \"5f9cc52f-4998-4934-a999-16ee91bf3d4a\") " pod="openshift-insights/insights-operator-585dfdc468-qrddw"
Apr 20 13:31:22.851570 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.851536 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1a7d897-4926-44f1-affc-910b9bae479b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ws27x\" (UID: \"e1a7d897-4926-44f1-affc-910b9bae479b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ws27x"
Apr 20 13:31:22.851570 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.851565 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0308410e-09ea-4fe7-82af-d5ff8756db17-installation-pull-secrets\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:22.851803 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.851597 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6b96\" (UniqueName: \"kubernetes.io/projected/74ecab64-c701-4519-98e0-66ade918c111-kube-api-access-b6b96\") pod \"volume-data-source-validator-7c6cbb6c87-gzsss\" (UID: \"74ecab64-c701-4519-98e0-66ade918c111\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gzsss"
Apr 20 13:31:22.851803 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.851618 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/5f9cc52f-4998-4934-a999-16ee91bf3d4a-snapshots\") pod \"insights-operator-585dfdc468-qrddw\" (UID: \"5f9cc52f-4998-4934-a999-16ee91bf3d4a\") " pod="openshift-insights/insights-operator-585dfdc468-qrddw"
Apr 20 13:31:22.851803 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.851644 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0308410e-09ea-4fe7-82af-d5ff8756db17-image-registry-private-configuration\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:22.851803 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.851660 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6q9xc\" (UniqueName: \"kubernetes.io/projected/e1a7d897-4926-44f1-affc-910b9bae479b-kube-api-access-6q9xc\") pod \"cluster-samples-operator-6dc5bdb6b4-ws27x\" (UID: \"e1a7d897-4926-44f1-affc-910b9bae479b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ws27x"
Apr 20 13:31:22.851803 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.851662 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0308410e-09ea-4fe7-82af-d5ff8756db17-ca-trust-extracted\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:22.851803 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.851686 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f9cc52f-4998-4934-a999-16ee91bf3d4a-service-ca-bundle\") pod \"insights-operator-585dfdc468-qrddw\" (UID: \"5f9cc52f-4998-4934-a999-16ee91bf3d4a\") " pod="openshift-insights/insights-operator-585dfdc468-qrddw"
Apr 20 13:31:22.851803 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.851732 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:22.852156 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.852135 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f9cc52f-4998-4934-a999-16ee91bf3d4a-service-ca-bundle\") pod \"insights-operator-585dfdc468-qrddw\" (UID: \"5f9cc52f-4998-4934-a999-16ee91bf3d4a\") " pod="openshift-insights/insights-operator-585dfdc468-qrddw"
Apr 20 13:31:22.852385 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.852362 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5f9cc52f-4998-4934-a999-16ee91bf3d4a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-qrddw\" (UID: \"5f9cc52f-4998-4934-a999-16ee91bf3d4a\") " pod="openshift-insights/insights-operator-585dfdc468-qrddw"
Apr 20 13:31:22.852662 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.852477 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0308410e-09ea-4fe7-82af-d5ff8756db17-trusted-ca\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:22.852662 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:22.852611 2563 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 13:31:22.852662 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.852620 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5f9cc52f-4998-4934-a999-16ee91bf3d4a-tmp\") pod \"insights-operator-585dfdc468-qrddw\" (UID: \"5f9cc52f-4998-4934-a999-16ee91bf3d4a\") " pod="openshift-insights/insights-operator-585dfdc468-qrddw"
Apr 20 13:31:22.852878 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:22.852672 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 13:31:22.852878 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:22.852683 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f94986b8b-ms7wn: secret "image-registry-tls" not found
Apr 20 13:31:22.852878 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:22.852675 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1a7d897-4926-44f1-affc-910b9bae479b-samples-operator-tls podName:e1a7d897-4926-44f1-affc-910b9bae479b nodeName:}" failed. No retries permitted until 2026-04-20 13:31:23.352656641 +0000 UTC m=+38.916347921 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e1a7d897-4926-44f1-affc-910b9bae479b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ws27x" (UID: "e1a7d897-4926-44f1-affc-910b9bae479b") : secret "samples-operator-tls" not found
Apr 20 13:31:22.852878 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.852723 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-certificates\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:22.852878 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.852736 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/5f9cc52f-4998-4934-a999-16ee91bf3d4a-snapshots\") pod \"insights-operator-585dfdc468-qrddw\" (UID: \"5f9cc52f-4998-4934-a999-16ee91bf3d4a\") " pod="openshift-insights/insights-operator-585dfdc468-qrddw"
Apr 20 13:31:22.852878 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:22.852770 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls podName:0308410e-09ea-4fe7-82af-d5ff8756db17 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:23.352759169 +0000 UTC m=+38.916450418 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls") pod "image-registry-7f94986b8b-ms7wn" (UID: "0308410e-09ea-4fe7-82af-d5ff8756db17") : secret "image-registry-tls" not found
Apr 20 13:31:22.853309 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.853268 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-certificates\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:22.854371 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.854350 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f9cc52f-4998-4934-a999-16ee91bf3d4a-serving-cert\") pod \"insights-operator-585dfdc468-qrddw\" (UID: \"5f9cc52f-4998-4934-a999-16ee91bf3d4a\") " pod="openshift-insights/insights-operator-585dfdc468-qrddw"
Apr 20 13:31:22.854464 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.854409 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0308410e-09ea-4fe7-82af-d5ff8756db17-image-registry-private-configuration\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:22.855515 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.855496 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0308410e-09ea-4fe7-82af-d5ff8756db17-installation-pull-secrets\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:22.860553 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.860532 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-bound-sa-token\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:22.860553 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.860531 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-59bvx\" (UniqueName: \"kubernetes.io/projected/5f9cc52f-4998-4934-a999-16ee91bf3d4a-kube-api-access-59bvx\") pod \"insights-operator-585dfdc468-qrddw\" (UID: \"5f9cc52f-4998-4934-a999-16ee91bf3d4a\") " pod="openshift-insights/insights-operator-585dfdc468-qrddw"
Apr 20 13:31:22.860791 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.860711 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9s6x\" (UniqueName: \"kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-kube-api-access-w9s6x\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:22.861506 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.861486 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q9xc\" (UniqueName: \"kubernetes.io/projected/e1a7d897-4926-44f1-affc-910b9bae479b-kube-api-access-6q9xc\") pod \"cluster-samples-operator-6dc5bdb6b4-ws27x\" (UID: \"e1a7d897-4926-44f1-affc-910b9bae479b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ws27x"
\"kubernetes.io/projected/e1a7d897-4926-44f1-affc-910b9bae479b-kube-api-access-6q9xc\") pod \"cluster-samples-operator-6dc5bdb6b4-ws27x\" (UID: \"e1a7d897-4926-44f1-affc-910b9bae479b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ws27x" Apr 20 13:31:22.863904 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.863885 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lrfsk" Apr 20 13:31:22.953809 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.953771 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6b96\" (UniqueName: \"kubernetes.io/projected/74ecab64-c701-4519-98e0-66ade918c111-kube-api-access-b6b96\") pod \"volume-data-source-validator-7c6cbb6c87-gzsss\" (UID: \"74ecab64-c701-4519-98e0-66ade918c111\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gzsss" Apr 20 13:31:22.963460 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.963429 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6b96\" (UniqueName: \"kubernetes.io/projected/74ecab64-c701-4519-98e0-66ade918c111-kube-api-access-b6b96\") pod \"volume-data-source-validator-7c6cbb6c87-gzsss\" (UID: \"74ecab64-c701-4519-98e0-66ade918c111\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gzsss" Apr 20 13:31:22.970219 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:22.970182 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-qrddw" Apr 20 13:31:23.075743 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:23.075708 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gzsss" Apr 20 13:31:23.357030 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:23.356978 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1a7d897-4926-44f1-affc-910b9bae479b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ws27x\" (UID: \"e1a7d897-4926-44f1-affc-910b9bae479b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ws27x" Apr 20 13:31:23.357656 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:23.357102 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn" Apr 20 13:31:23.357656 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:23.357167 2563 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 13:31:23.357656 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:23.357224 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 13:31:23.357656 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:23.357237 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f94986b8b-ms7wn: secret "image-registry-tls" not found Apr 20 13:31:23.357656 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:23.357247 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1a7d897-4926-44f1-affc-910b9bae479b-samples-operator-tls podName:e1a7d897-4926-44f1-affc-910b9bae479b nodeName:}" failed. No retries permitted until 2026-04-20 13:31:24.357226054 +0000 UTC m=+39.920917309 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e1a7d897-4926-44f1-affc-910b9bae479b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ws27x" (UID: "e1a7d897-4926-44f1-affc-910b9bae479b") : secret "samples-operator-tls" not found Apr 20 13:31:23.357656 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:23.357284 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls podName:0308410e-09ea-4fe7-82af-d5ff8756db17 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:24.357272115 +0000 UTC m=+39.920963370 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls") pod "image-registry-7f94986b8b-ms7wn" (UID: "0308410e-09ea-4fe7-82af-d5ff8756db17") : secret "image-registry-tls" not found Apr 20 13:31:23.791667 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:23.791634 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lrfsk"] Apr 20 13:31:23.825202 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:31:23.825158 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f159e37_18ec_4deb_a9fa_9b41fad19818.slice/crio-8bc5407d5d3ffb99da02f3735cec30de010bdfceebf6cf6152bebde8dacc40a5 WatchSource:0}: Error finding container 8bc5407d5d3ffb99da02f3735cec30de010bdfceebf6cf6152bebde8dacc40a5: Status 404 returned error can't find the container with id 8bc5407d5d3ffb99da02f3735cec30de010bdfceebf6cf6152bebde8dacc40a5 Apr 20 13:31:24.012400 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:24.012370 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-mtqqq"] Apr 20 13:31:24.016152 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:24.016104 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gzsss"] Apr 20 13:31:24.016536 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:31:24.016488 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cf1343c_20e8_44fe_bb3e_59e64970b967.slice/crio-9104635f1474709df6bfc8e18af6ec2f99c0c2f1a5c300ed2a29b0f93e908789 WatchSource:0}: Error finding container 9104635f1474709df6bfc8e18af6ec2f99c0c2f1a5c300ed2a29b0f93e908789: Status 404 returned error can't find the container with id 9104635f1474709df6bfc8e18af6ec2f99c0c2f1a5c300ed2a29b0f93e908789 Apr 20 13:31:24.017283 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:24.017254 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-qrddw"] Apr 20 13:31:24.021814 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:31:24.021789 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f9cc52f_4998_4934_a999_16ee91bf3d4a.slice/crio-a2da1657a49003d19111d3340faa2602bfd8e959ef533cff33bf0ffc4881b3a4 WatchSource:0}: Error finding container a2da1657a49003d19111d3340faa2602bfd8e959ef533cff33bf0ffc4881b3a4: Status 404 returned error can't find the container with id a2da1657a49003d19111d3340faa2602bfd8e959ef533cff33bf0ffc4881b3a4 Apr 20 13:31:24.306757 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:24.306708 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ksm7s" event={"ID":"68a134ea-4533-4317-bf3f-b4e22e808c81","Type":"ContainerStarted","Data":"57f8de0921caa46518e153fbc30468a5886fddf685dfaf0ed2e57c8e0106ddd2"} Apr 20 13:31:24.307802 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:24.307773 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mtqqq" event={"ID":"9cf1343c-20e8-44fe-bb3e-59e64970b967","Type":"ContainerStarted","Data":"9104635f1474709df6bfc8e18af6ec2f99c0c2f1a5c300ed2a29b0f93e908789"} Apr 20 13:31:24.308693 ip-10-0-132-232 kubenswrapper[2563]: 
Apr 20 13:31:24.312420 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:24.309980 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-qrddw" event={"ID":"5f9cc52f-4998-4934-a999-16ee91bf3d4a","Type":"ContainerStarted","Data":"a2da1657a49003d19111d3340faa2602bfd8e959ef533cff33bf0ffc4881b3a4"}
Apr 20 13:31:24.313697 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:24.313671 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lrfsk" event={"ID":"4f159e37-18ec-4deb-a9fa-9b41fad19818","Type":"ContainerStarted","Data":"8bc5407d5d3ffb99da02f3735cec30de010bdfceebf6cf6152bebde8dacc40a5"}
Apr 20 13:31:24.327256 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:24.327212 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-ksm7s" podStartSLOduration=5.409600614 podStartE2EDuration="9.327198592s" podCreationTimestamp="2026-04-20 13:31:15 +0000 UTC" firstStartedPulling="2026-04-20 13:31:19.932917927 +0000 UTC m=+35.496609180" lastFinishedPulling="2026-04-20 13:31:23.850515897 +0000 UTC m=+39.414207158" observedRunningTime="2026-04-20 13:31:24.326584217 +0000 UTC m=+39.890275486" watchObservedRunningTime="2026-04-20 13:31:24.327198592 +0000 UTC m=+39.890889890"
Apr 20 13:31:24.367134 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:24.367099 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1a7d897-4926-44f1-affc-910b9bae479b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ws27x\" (UID: \"e1a7d897-4926-44f1-affc-910b9bae479b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ws27x"
Apr 20 13:31:24.367518 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:24.367174 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:24.367518 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:24.367248 2563 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 13:31:24.367518 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:24.367311 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 13:31:24.367518 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:24.367328 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f94986b8b-ms7wn: secret "image-registry-tls" not found
Apr 20 13:31:24.367518 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:24.367314 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1a7d897-4926-44f1-affc-910b9bae479b-samples-operator-tls podName:e1a7d897-4926-44f1-affc-910b9bae479b nodeName:}" failed. No retries permitted until 2026-04-20 13:31:26.367299272 +0000 UTC m=+41.930990519 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e1a7d897-4926-44f1-affc-910b9bae479b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ws27x" (UID: "e1a7d897-4926-44f1-affc-910b9bae479b") : secret "samples-operator-tls" not found
Apr 20 13:31:24.367518 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:24.367375 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls podName:0308410e-09ea-4fe7-82af-d5ff8756db17 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:26.367362861 +0000 UTC m=+41.931054112 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls") pod "image-registry-7f94986b8b-ms7wn" (UID: "0308410e-09ea-4fe7-82af-d5ff8756db17") : secret "image-registry-tls" not found
Apr 20 13:31:25.679645 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:25.679574 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fd3c7eb-0c40-4911-a067-10cda31de0d7-metrics-tls\") pod \"dns-default-jq28g\" (UID: \"5fd3c7eb-0c40-4911-a067-10cda31de0d7\") " pod="openshift-dns/dns-default-jq28g"
Apr 20 13:31:25.680126 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:25.679771 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c0036f9-a08e-4eac-8ad7-301fe4765604-cert\") pod \"ingress-canary-lw64p\" (UID: \"8c0036f9-a08e-4eac-8ad7-301fe4765604\") " pod="openshift-ingress-canary/ingress-canary-lw64p"
Apr 20 13:31:25.680126 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:25.680096 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 13:31:25.680239 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:25.680176 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c0036f9-a08e-4eac-8ad7-301fe4765604-cert podName:8c0036f9-a08e-4eac-8ad7-301fe4765604 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:33.680161407 +0000 UTC m=+49.243852654 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c0036f9-a08e-4eac-8ad7-301fe4765604-cert") pod "ingress-canary-lw64p" (UID: "8c0036f9-a08e-4eac-8ad7-301fe4765604") : secret "canary-serving-cert" not found
Apr 20 13:31:25.680613 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:25.680528 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 13:31:25.680613 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:25.680568 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fd3c7eb-0c40-4911-a067-10cda31de0d7-metrics-tls podName:5fd3c7eb-0c40-4911-a067-10cda31de0d7 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:33.680558672 +0000 UTC m=+49.244249920 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5fd3c7eb-0c40-4911-a067-10cda31de0d7-metrics-tls") pod "dns-default-jq28g" (UID: "5fd3c7eb-0c40-4911-a067-10cda31de0d7") : secret "dns-default-metrics-tls" not found
Apr 20 13:31:26.386197 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:26.386159 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:26.386366 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:26.386250 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1a7d897-4926-44f1-affc-910b9bae479b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ws27x\" (UID: \"e1a7d897-4926-44f1-affc-910b9bae479b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ws27x"
Apr 20 13:31:26.386366 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:26.386312 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 13:31:26.386366 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:26.386333 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f94986b8b-ms7wn: secret "image-registry-tls" not found
Apr 20 13:31:26.386366 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:26.386352 2563 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 13:31:26.386495 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:26.386388 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls podName:0308410e-09ea-4fe7-82af-d5ff8756db17 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:30.386372837 +0000 UTC m=+45.950064085 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls") pod "image-registry-7f94986b8b-ms7wn" (UID: "0308410e-09ea-4fe7-82af-d5ff8756db17") : secret "image-registry-tls" not found
Apr 20 13:31:26.386495 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:26.386401 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1a7d897-4926-44f1-affc-910b9bae479b-samples-operator-tls podName:e1a7d897-4926-44f1-affc-910b9bae479b nodeName:}" failed. No retries permitted until 2026-04-20 13:31:30.386395341 +0000 UTC m=+45.950086588 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e1a7d897-4926-44f1-affc-910b9bae479b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ws27x" (UID: "e1a7d897-4926-44f1-affc-910b9bae479b") : secret "samples-operator-tls" not found Apr 20 13:31:29.327645 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:29.327608 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mtqqq" event={"ID":"9cf1343c-20e8-44fe-bb3e-59e64970b967","Type":"ContainerStarted","Data":"df356b448688ddbb56f2bd527b84afa77e3f3b42f8ee328be03809448320e297"} Apr 20 13:31:29.329108 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:29.329081 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gzsss" event={"ID":"74ecab64-c701-4519-98e0-66ade918c111","Type":"ContainerStarted","Data":"ac022f44a3e7935d3773cb3a4b81175f00ea1ca5ca97082c61f3c716fe1648d4"} Apr 20 13:31:29.330528 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:29.330502 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-qrddw" event={"ID":"5f9cc52f-4998-4934-a999-16ee91bf3d4a","Type":"ContainerStarted","Data":"b4f5b4ff684648b586f778b562940e381f0a293822c8cf436a332232dbb3126b"} Apr 20 13:31:29.331855 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:29.331833 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lrfsk" event={"ID":"4f159e37-18ec-4deb-a9fa-9b41fad19818","Type":"ContainerStarted","Data":"fa2ca5fc85d16b6c0f68403d04a69b0feb1149fbfb3c16f23cefd710eaa38d6e"} Apr 20 13:31:29.344840 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:29.344787 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-mtqqq" podStartSLOduration=2.460338115 podStartE2EDuration="7.344770799s" podCreationTimestamp="2026-04-20 13:31:22 +0000 UTC" firstStartedPulling="2026-04-20 13:31:24.018922358 +0000 UTC m=+39.582613609" lastFinishedPulling="2026-04-20 13:31:28.903355045 +0000 UTC m=+44.467046293" observedRunningTime="2026-04-20 13:31:29.343211012 +0000 UTC m=+44.906902283" watchObservedRunningTime="2026-04-20 13:31:29.344770799 +0000 UTC m=+44.908462072" Apr 20 13:31:29.361742 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:29.361696 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-gzsss" podStartSLOduration=2.4893146809999998 podStartE2EDuration="7.361685934s" podCreationTimestamp="2026-04-20 13:31:22 +0000 UTC" firstStartedPulling="2026-04-20 13:31:24.023869135 +0000 UTC m=+39.587560386" lastFinishedPulling="2026-04-20 13:31:28.896240388 +0000 UTC m=+44.459931639" observedRunningTime="2026-04-20 13:31:29.360799859 +0000 UTC m=+44.924491130" watchObservedRunningTime="2026-04-20 13:31:29.361685934 +0000 UTC m=+44.925377208" Apr 20 13:31:29.376174 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:29.375900 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lrfsk" podStartSLOduration=2.319295345 podStartE2EDuration="7.37588544s" podCreationTimestamp="2026-04-20 13:31:22 +0000 UTC" firstStartedPulling="2026-04-20 13:31:23.83796198 +0000 UTC m=+39.401653232" lastFinishedPulling="2026-04-20 
13:31:28.894552062 +0000 UTC m=+44.458243327" observedRunningTime="2026-04-20 13:31:29.375842144 +0000 UTC m=+44.939533415" watchObservedRunningTime="2026-04-20 13:31:29.37588544 +0000 UTC m=+44.939576710" Apr 20 13:31:29.392558 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:29.392503 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-qrddw" podStartSLOduration=2.520747236 podStartE2EDuration="7.39248315s" podCreationTimestamp="2026-04-20 13:31:22 +0000 UTC" firstStartedPulling="2026-04-20 13:31:24.024816586 +0000 UTC m=+39.588507838" lastFinishedPulling="2026-04-20 13:31:28.896552489 +0000 UTC m=+44.460243752" observedRunningTime="2026-04-20 13:31:29.391585209 +0000 UTC m=+44.955276480" watchObservedRunningTime="2026-04-20 13:31:29.39248315 +0000 UTC m=+44.956174423" Apr 20 13:31:30.418974 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:30.418933 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn" Apr 20 13:31:30.419406 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:30.419030 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1a7d897-4926-44f1-affc-910b9bae479b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ws27x\" (UID: \"e1a7d897-4926-44f1-affc-910b9bae479b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ws27x" Apr 20 13:31:30.419406 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:30.419139 2563 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 13:31:30.419406 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:30.419137 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 13:31:30.419406 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:30.419165 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f94986b8b-ms7wn: secret "image-registry-tls" not found Apr 20 13:31:30.419406 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:30.419202 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1a7d897-4926-44f1-affc-910b9bae479b-samples-operator-tls podName:e1a7d897-4926-44f1-affc-910b9bae479b nodeName:}" failed. No retries permitted until 2026-04-20 13:31:38.419183169 +0000 UTC m=+53.982874435 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e1a7d897-4926-44f1-affc-910b9bae479b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ws27x" (UID: "e1a7d897-4926-44f1-affc-910b9bae479b") : secret "samples-operator-tls" not found Apr 20 13:31:30.419406 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:30.419228 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls podName:0308410e-09ea-4fe7-82af-d5ff8756db17 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:38.419210923 +0000 UTC m=+53.982902176 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls") pod "image-registry-7f94986b8b-ms7wn" (UID: "0308410e-09ea-4fe7-82af-d5ff8756db17") : secret "image-registry-tls" not found Apr 20 13:31:33.046656 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.046620 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-qdwz4"] Apr 20 13:31:33.074979 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.074946 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qdwz4" Apr 20 13:31:33.077714 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.077689 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 13:31:33.077930 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.077893 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 13:31:33.077930 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.077893 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vbdzl\"" Apr 20 13:31:33.078139 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.078122 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qdwz4"] Apr 20 13:31:33.114380 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.114354 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rxqc7_8ca98088-8b65-4efe-ad4e-3df5a8fe02b5/dns-node-resolver/0.log" Apr 20 13:31:33.142779 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.142738 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b2049f90-5ce4-4282-9210-29370c7e0bda-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qdwz4\" (UID: \"b2049f90-5ce4-4282-9210-29370c7e0bda\") " pod="openshift-insights/insights-runtime-extractor-qdwz4" Apr 20 13:31:33.142978 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.142831 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b2049f90-5ce4-4282-9210-29370c7e0bda-crio-socket\") pod \"insights-runtime-extractor-qdwz4\" (UID: \"b2049f90-5ce4-4282-9210-29370c7e0bda\") " pod="openshift-insights/insights-runtime-extractor-qdwz4" Apr 20 13:31:33.142978 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.142861 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b2049f90-5ce4-4282-9210-29370c7e0bda-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qdwz4\" (UID: \"b2049f90-5ce4-4282-9210-29370c7e0bda\") " pod="openshift-insights/insights-runtime-extractor-qdwz4" Apr 20 13:31:33.142978 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.142892 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b2049f90-5ce4-4282-9210-29370c7e0bda-data-volume\") pod \"insights-runtime-extractor-qdwz4\" (UID: \"b2049f90-5ce4-4282-9210-29370c7e0bda\") " 
pod="openshift-insights/insights-runtime-extractor-qdwz4" Apr 20 13:31:33.143118 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.142990 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjq84\" (UniqueName: \"kubernetes.io/projected/b2049f90-5ce4-4282-9210-29370c7e0bda-kube-api-access-tjq84\") pod \"insights-runtime-extractor-qdwz4\" (UID: \"b2049f90-5ce4-4282-9210-29370c7e0bda\") " pod="openshift-insights/insights-runtime-extractor-qdwz4" Apr 20 13:31:33.205465 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.205434 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jl466"] Apr 20 13:31:33.235025 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.234990 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jl466"] Apr 20 13:31:33.235233 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.235140 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-jl466" Apr 20 13:31:33.237800 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.237773 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 20 13:31:33.237800 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.237803 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 20 13:31:33.238038 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.237874 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 20 13:31:33.238038 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.238036 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 20 13:31:33.238713 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.238690 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-nbfx6\"" Apr 20 13:31:33.243551 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.243530 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b2049f90-5ce4-4282-9210-29370c7e0bda-data-volume\") pod \"insights-runtime-extractor-qdwz4\" (UID: \"b2049f90-5ce4-4282-9210-29370c7e0bda\") " pod="openshift-insights/insights-runtime-extractor-qdwz4" Apr 20 13:31:33.243653 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.243582 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjq84\" (UniqueName: \"kubernetes.io/projected/b2049f90-5ce4-4282-9210-29370c7e0bda-kube-api-access-tjq84\") pod \"insights-runtime-extractor-qdwz4\" (UID: \"b2049f90-5ce4-4282-9210-29370c7e0bda\") " pod="openshift-insights/insights-runtime-extractor-qdwz4" Apr 20 13:31:33.243653 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.243607 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fb827976-50a7-4c30-9c52-38e78c5e390c-signing-cabundle\") pod \"service-ca-865cb79987-jl466\" (UID: \"fb827976-50a7-4c30-9c52-38e78c5e390c\") " pod="openshift-service-ca/service-ca-865cb79987-jl466" Apr 20 13:31:33.243653 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.243644 2563 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b2049f90-5ce4-4282-9210-29370c7e0bda-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qdwz4\" (UID: \"b2049f90-5ce4-4282-9210-29370c7e0bda\") " pod="openshift-insights/insights-runtime-extractor-qdwz4" Apr 20 13:31:33.243778 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.243683 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fb827976-50a7-4c30-9c52-38e78c5e390c-signing-key\") pod \"service-ca-865cb79987-jl466\" (UID: \"fb827976-50a7-4c30-9c52-38e78c5e390c\") " pod="openshift-service-ca/service-ca-865cb79987-jl466" Apr 20 13:31:33.243778 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.243711 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n64nn\" (UniqueName: \"kubernetes.io/projected/fb827976-50a7-4c30-9c52-38e78c5e390c-kube-api-access-n64nn\") pod \"service-ca-865cb79987-jl466\" (UID: \"fb827976-50a7-4c30-9c52-38e78c5e390c\") " pod="openshift-service-ca/service-ca-865cb79987-jl466" Apr 20 13:31:33.243778 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.243749 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b2049f90-5ce4-4282-9210-29370c7e0bda-crio-socket\") pod \"insights-runtime-extractor-qdwz4\" (UID: \"b2049f90-5ce4-4282-9210-29370c7e0bda\") " pod="openshift-insights/insights-runtime-extractor-qdwz4" Apr 20 13:31:33.243778 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:33.243765 2563 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 13:31:33.243951 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.243798 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b2049f90-5ce4-4282-9210-29370c7e0bda-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qdwz4\" (UID: \"b2049f90-5ce4-4282-9210-29370c7e0bda\") " pod="openshift-insights/insights-runtime-extractor-qdwz4" Apr 20 13:31:33.243951 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:33.243824 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2049f90-5ce4-4282-9210-29370c7e0bda-insights-runtime-extractor-tls podName:b2049f90-5ce4-4282-9210-29370c7e0bda nodeName:}" failed. No retries permitted until 2026-04-20 13:31:33.743805551 +0000 UTC m=+49.307496799 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/b2049f90-5ce4-4282-9210-29370c7e0bda-insights-runtime-extractor-tls") pod "insights-runtime-extractor-qdwz4" (UID: "b2049f90-5ce4-4282-9210-29370c7e0bda") : secret "insights-runtime-extractor-tls" not found Apr 20 13:31:33.243951 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.243862 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b2049f90-5ce4-4282-9210-29370c7e0bda-data-volume\") pod \"insights-runtime-extractor-qdwz4\" (UID: \"b2049f90-5ce4-4282-9210-29370c7e0bda\") " pod="openshift-insights/insights-runtime-extractor-qdwz4" Apr 20 13:31:33.243951 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.243931 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b2049f90-5ce4-4282-9210-29370c7e0bda-crio-socket\") pod \"insights-runtime-extractor-qdwz4\" (UID: \"b2049f90-5ce4-4282-9210-29370c7e0bda\") " pod="openshift-insights/insights-runtime-extractor-qdwz4" Apr 20 13:31:33.244183 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.244166 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b2049f90-5ce4-4282-9210-29370c7e0bda-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qdwz4\" (UID: \"b2049f90-5ce4-4282-9210-29370c7e0bda\") " pod="openshift-insights/insights-runtime-extractor-qdwz4" Apr 20 13:31:33.253370 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.253348 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjq84\" (UniqueName: \"kubernetes.io/projected/b2049f90-5ce4-4282-9210-29370c7e0bda-kube-api-access-tjq84\") pod \"insights-runtime-extractor-qdwz4\" (UID: \"b2049f90-5ce4-4282-9210-29370c7e0bda\") " pod="openshift-insights/insights-runtime-extractor-qdwz4" Apr 20 13:31:33.345004 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.344926 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fb827976-50a7-4c30-9c52-38e78c5e390c-signing-cabundle\") pod \"service-ca-865cb79987-jl466\" (UID: \"fb827976-50a7-4c30-9c52-38e78c5e390c\") " pod="openshift-service-ca/service-ca-865cb79987-jl466" Apr 20 13:31:33.345165 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.345014 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fb827976-50a7-4c30-9c52-38e78c5e390c-signing-key\") pod \"service-ca-865cb79987-jl466\" (UID: \"fb827976-50a7-4c30-9c52-38e78c5e390c\") " pod="openshift-service-ca/service-ca-865cb79987-jl466" Apr 20 13:31:33.345165 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.345043 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n64nn\" (UniqueName: \"kubernetes.io/projected/fb827976-50a7-4c30-9c52-38e78c5e390c-kube-api-access-n64nn\") pod \"service-ca-865cb79987-jl466\" (UID: \"fb827976-50a7-4c30-9c52-38e78c5e390c\") " pod="openshift-service-ca/service-ca-865cb79987-jl466" Apr 20 13:31:33.345682 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.345657 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fb827976-50a7-4c30-9c52-38e78c5e390c-signing-cabundle\") pod 
\"service-ca-865cb79987-jl466\" (UID: \"fb827976-50a7-4c30-9c52-38e78c5e390c\") " pod="openshift-service-ca/service-ca-865cb79987-jl466" Apr 20 13:31:33.347687 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.347665 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fb827976-50a7-4c30-9c52-38e78c5e390c-signing-key\") pod \"service-ca-865cb79987-jl466\" (UID: \"fb827976-50a7-4c30-9c52-38e78c5e390c\") " pod="openshift-service-ca/service-ca-865cb79987-jl466" Apr 20 13:31:33.354311 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.354287 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n64nn\" (UniqueName: \"kubernetes.io/projected/fb827976-50a7-4c30-9c52-38e78c5e390c-kube-api-access-n64nn\") pod \"service-ca-865cb79987-jl466\" (UID: \"fb827976-50a7-4c30-9c52-38e78c5e390c\") " pod="openshift-service-ca/service-ca-865cb79987-jl466" Apr 20 13:31:33.544211 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.544175 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-jl466" Apr 20 13:31:33.664975 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.664829 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jl466"] Apr 20 13:31:33.714151 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.714128 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hw898_b1a9fc24-9a0e-4d24-aa45-ec1711e1399a/node-ca/0.log" Apr 20 13:31:33.747185 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.747152 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c0036f9-a08e-4eac-8ad7-301fe4765604-cert\") pod \"ingress-canary-lw64p\" (UID: \"8c0036f9-a08e-4eac-8ad7-301fe4765604\") " pod="openshift-ingress-canary/ingress-canary-lw64p" Apr 20 13:31:33.747349 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.747228 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b2049f90-5ce4-4282-9210-29370c7e0bda-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qdwz4\" (UID: \"b2049f90-5ce4-4282-9210-29370c7e0bda\") " pod="openshift-insights/insights-runtime-extractor-qdwz4" Apr 20 13:31:33.747349 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:33.747301 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 13:31:33.747349 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:33.747328 2563 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 13:31:33.747479 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:33.747357 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c0036f9-a08e-4eac-8ad7-301fe4765604-cert podName:8c0036f9-a08e-4eac-8ad7-301fe4765604 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:49.74734244 +0000 UTC m=+65.311033688 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8c0036f9-a08e-4eac-8ad7-301fe4765604-cert") pod "ingress-canary-lw64p" (UID: "8c0036f9-a08e-4eac-8ad7-301fe4765604") : secret "canary-serving-cert" not found Apr 20 13:31:33.747479 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:33.747380 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2049f90-5ce4-4282-9210-29370c7e0bda-insights-runtime-extractor-tls podName:b2049f90-5ce4-4282-9210-29370c7e0bda nodeName:}" failed. No retries permitted until 2026-04-20 13:31:34.747363783 +0000 UTC m=+50.311055048 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/b2049f90-5ce4-4282-9210-29370c7e0bda-insights-runtime-extractor-tls") pod "insights-runtime-extractor-qdwz4" (UID: "b2049f90-5ce4-4282-9210-29370c7e0bda") : secret "insights-runtime-extractor-tls" not found Apr 20 13:31:33.747479 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:33.747448 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fd3c7eb-0c40-4911-a067-10cda31de0d7-metrics-tls\") pod \"dns-default-jq28g\" (UID: \"5fd3c7eb-0c40-4911-a067-10cda31de0d7\") " pod="openshift-dns/dns-default-jq28g" Apr 20 13:31:33.747654 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:33.747564 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 13:31:33.747654 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:33.747610 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fd3c7eb-0c40-4911-a067-10cda31de0d7-metrics-tls podName:5fd3c7eb-0c40-4911-a067-10cda31de0d7 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:49.747597647 +0000 UTC m=+65.311288913 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5fd3c7eb-0c40-4911-a067-10cda31de0d7-metrics-tls") pod "dns-default-jq28g" (UID: "5fd3c7eb-0c40-4911-a067-10cda31de0d7") : secret "dns-default-metrics-tls" not found Apr 20 13:31:34.346012 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:34.345972 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-jl466" event={"ID":"fb827976-50a7-4c30-9c52-38e78c5e390c","Type":"ContainerStarted","Data":"04c78f2cdd3114e4c69cc86bf3ee6d1aa37e6925fa5730c59208cb82fa29c8f3"} Apr 20 13:31:34.346012 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:34.346018 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-jl466" event={"ID":"fb827976-50a7-4c30-9c52-38e78c5e390c","Type":"ContainerStarted","Data":"aad97d626e510ec1076587ff2bad0869c5421948d140026e6244062f670a1668"} Apr 20 13:31:34.363645 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:34.363603 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-jl466" podStartSLOduration=1.363590421 podStartE2EDuration="1.363590421s" podCreationTimestamp="2026-04-20 13:31:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 13:31:34.362458247 +0000 UTC m=+49.926149540" watchObservedRunningTime="2026-04-20 13:31:34.363590421 +0000 UTC m=+49.927281690" Apr 20 13:31:34.753474 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:34.753390 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b2049f90-5ce4-4282-9210-29370c7e0bda-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qdwz4\" (UID: \"b2049f90-5ce4-4282-9210-29370c7e0bda\") " pod="openshift-insights/insights-runtime-extractor-qdwz4" Apr 20 13:31:34.753627 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:34.753534 2563 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 13:31:34.753627 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:34.753601 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2049f90-5ce4-4282-9210-29370c7e0bda-insights-runtime-extractor-tls podName:b2049f90-5ce4-4282-9210-29370c7e0bda nodeName:}" failed. No retries permitted until 2026-04-20 13:31:36.753584317 +0000 UTC m=+52.317275569 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/b2049f90-5ce4-4282-9210-29370c7e0bda-insights-runtime-extractor-tls") pod "insights-runtime-extractor-qdwz4" (UID: "b2049f90-5ce4-4282-9210-29370c7e0bda") : secret "insights-runtime-extractor-tls" not found Apr 20 13:31:36.770685 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:36.770634 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b2049f90-5ce4-4282-9210-29370c7e0bda-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qdwz4\" (UID: \"b2049f90-5ce4-4282-9210-29370c7e0bda\") " pod="openshift-insights/insights-runtime-extractor-qdwz4" Apr 20 13:31:36.771195 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:36.770801 2563 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 20 13:31:36.771195 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:36.770871 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b2049f90-5ce4-4282-9210-29370c7e0bda-insights-runtime-extractor-tls podName:b2049f90-5ce4-4282-9210-29370c7e0bda nodeName:}" failed. No retries permitted until 2026-04-20 13:31:40.77085598 +0000 UTC m=+56.334547227 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/b2049f90-5ce4-4282-9210-29370c7e0bda-insights-runtime-extractor-tls") pod "insights-runtime-extractor-qdwz4" (UID: "b2049f90-5ce4-4282-9210-29370c7e0bda") : secret "insights-runtime-extractor-tls" not found Apr 20 13:31:38.484737 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:38.484699 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1a7d897-4926-44f1-affc-910b9bae479b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ws27x\" (UID: \"e1a7d897-4926-44f1-affc-910b9bae479b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ws27x" Apr 20 13:31:38.485185 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:38.484760 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn" Apr 20 13:31:38.485185 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:38.484872 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 13:31:38.485185 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:38.484891 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7f94986b8b-ms7wn: secret "image-registry-tls" not found Apr 20 13:31:38.485185 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:38.484962 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls podName:0308410e-09ea-4fe7-82af-d5ff8756db17 nodeName:}" failed. No retries permitted until 2026-04-20 13:31:54.484945762 +0000 UTC m=+70.048637014 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls") pod "image-registry-7f94986b8b-ms7wn" (UID: "0308410e-09ea-4fe7-82af-d5ff8756db17") : secret "image-registry-tls" not found Apr 20 13:31:38.487331 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:38.487307 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e1a7d897-4926-44f1-affc-910b9bae479b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ws27x\" (UID: \"e1a7d897-4926-44f1-affc-910b9bae479b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ws27x" Apr 20 13:31:38.783847 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:38.783811 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ws27x" Apr 20 13:31:38.909339 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:38.909207 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ws27x"] Apr 20 13:31:39.363260 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:39.363221 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ws27x" event={"ID":"e1a7d897-4926-44f1-affc-910b9bae479b","Type":"ContainerStarted","Data":"e41bac097dab2693c35a6c7c38e75ee60341439428b47219268ae70e2ca615cc"} Apr 20 13:31:40.805341 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:40.805296 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b2049f90-5ce4-4282-9210-29370c7e0bda-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qdwz4\" (UID: \"b2049f90-5ce4-4282-9210-29370c7e0bda\") " pod="openshift-insights/insights-runtime-extractor-qdwz4" Apr 20 13:31:40.808141 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:40.808112 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b2049f90-5ce4-4282-9210-29370c7e0bda-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qdwz4\" (UID: \"b2049f90-5ce4-4282-9210-29370c7e0bda\") " pod="openshift-insights/insights-runtime-extractor-qdwz4" Apr 20 13:31:40.884334 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:40.883864 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qdwz4" Apr 20 13:31:41.050005 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:41.049979 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qdwz4"] Apr 20 13:31:41.053707 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:31:41.053678 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2049f90_5ce4_4282_9210_29370c7e0bda.slice/crio-e79b13667a1764c6d80f052cecffd320ab22b0fec88914d7aac1814484566085 WatchSource:0}: Error finding container e79b13667a1764c6d80f052cecffd320ab22b0fec88914d7aac1814484566085: Status 404 returned error can't find the container with id e79b13667a1764c6d80f052cecffd320ab22b0fec88914d7aac1814484566085 Apr 20 13:31:41.370770 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:41.370680 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ws27x" event={"ID":"e1a7d897-4926-44f1-affc-910b9bae479b","Type":"ContainerStarted","Data":"b1539e637438cf39f58525a1b8fd1844ac0705896c56735628a061b0ebb8e369"} Apr 20 13:31:41.370770 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:41.370723 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ws27x" event={"ID":"e1a7d897-4926-44f1-affc-910b9bae479b","Type":"ContainerStarted","Data":"9d9ea18aaffffb37465d4f46d812ab6b6fb7546fe2aea1fb6cc6200d17fa9b9d"} Apr 20 13:31:41.372041 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:41.372013 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qdwz4" event={"ID":"b2049f90-5ce4-4282-9210-29370c7e0bda","Type":"ContainerStarted","Data":"300deed7b9cb3edc97d9276be84a0216516d6acffe79d1dc491b9825a3e062eb"} Apr 20 13:31:41.372041 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:41.372066 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qdwz4" event={"ID":"b2049f90-5ce4-4282-9210-29370c7e0bda","Type":"ContainerStarted","Data":"e79b13667a1764c6d80f052cecffd320ab22b0fec88914d7aac1814484566085"} Apr 20 13:31:41.390856 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:41.390807 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ws27x" podStartSLOduration=17.372377718 podStartE2EDuration="19.390789404s" podCreationTimestamp="2026-04-20 13:31:22 +0000 UTC" firstStartedPulling="2026-04-20 13:31:38.959698117 +0000 UTC m=+54.523389365" lastFinishedPulling="2026-04-20 13:31:40.978109793 +0000 UTC m=+56.541801051" observedRunningTime="2026-04-20 13:31:41.389693614 +0000 UTC m=+56.953384886" watchObservedRunningTime="2026-04-20 13:31:41.390789404 +0000 UTC m=+56.954480705" Apr 20 13:31:42.376296 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:42.376260 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qdwz4" event={"ID":"b2049f90-5ce4-4282-9210-29370c7e0bda","Type":"ContainerStarted","Data":"da248e7a68724c24f24dd1307b1bf3040cd5fe21ca688a10adce2914b3613427"} Apr 20 13:31:43.283075 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:43.282727 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mp62t" Apr 20 13:31:44.383082 ip-10-0-132-232 
kubenswrapper[2563]: I0420 13:31:44.383035 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qdwz4" event={"ID":"b2049f90-5ce4-4282-9210-29370c7e0bda","Type":"ContainerStarted","Data":"93528745db64faba2a3db0d1eb0d428a55e7baff79928ac5c39695e5b4dd5489"} Apr 20 13:31:44.403546 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:44.403494 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-qdwz4" podStartSLOduration=9.123759175 podStartE2EDuration="11.403477401s" podCreationTimestamp="2026-04-20 13:31:33 +0000 UTC" firstStartedPulling="2026-04-20 13:31:41.11495902 +0000 UTC m=+56.678650268" lastFinishedPulling="2026-04-20 13:31:43.394677229 +0000 UTC m=+58.958368494" observedRunningTime="2026-04-20 13:31:44.40244289 +0000 UTC m=+59.966134160" watchObservedRunningTime="2026-04-20 13:31:44.403477401 +0000 UTC m=+59.967168670" Apr 20 13:31:49.782552 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:49.782508 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fd3c7eb-0c40-4911-a067-10cda31de0d7-metrics-tls\") pod \"dns-default-jq28g\" (UID: \"5fd3c7eb-0c40-4911-a067-10cda31de0d7\") " pod="openshift-dns/dns-default-jq28g" Apr 20 13:31:49.782552 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:49.782554 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs\") pod \"network-metrics-daemon-5pkrd\" (UID: \"02041a2f-e9fd-4902-a9a4-47e4cd2889e4\") " pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:31:49.783208 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:49.782604 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c0036f9-a08e-4eac-8ad7-301fe4765604-cert\") pod \"ingress-canary-lw64p\" (UID: \"8c0036f9-a08e-4eac-8ad7-301fe4765604\") " pod="openshift-ingress-canary/ingress-canary-lw64p" Apr 20 13:31:49.785144 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:49.785116 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8c0036f9-a08e-4eac-8ad7-301fe4765604-cert\") pod \"ingress-canary-lw64p\" (UID: \"8c0036f9-a08e-4eac-8ad7-301fe4765604\") " pod="openshift-ingress-canary/ingress-canary-lw64p" Apr 20 13:31:49.785280 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:49.785125 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5fd3c7eb-0c40-4911-a067-10cda31de0d7-metrics-tls\") pod \"dns-default-jq28g\" (UID: \"5fd3c7eb-0c40-4911-a067-10cda31de0d7\") " pod="openshift-dns/dns-default-jq28g" Apr 20 13:31:49.785529 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:49.785510 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 13:31:49.795690 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:49.795666 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/02041a2f-e9fd-4902-a9a4-47e4cd2889e4-metrics-certs\") pod \"network-metrics-daemon-5pkrd\" (UID: \"02041a2f-e9fd-4902-a9a4-47e4cd2889e4\") " pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:31:49.883411 ip-10-0-132-232 kubenswrapper[2563]: 
I0420 13:31:49.883364 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2k6g\" (UniqueName: \"kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g\") pod \"network-check-target-x6gsn\" (UID: \"1d19bff6-e6ed-46e3-854b-04097f537694\") " pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:31:49.883659 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:49.883640 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gqjcl\"" Apr 20 13:31:49.886104 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:49.886082 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2k6g\" (UniqueName: \"kubernetes.io/projected/1d19bff6-e6ed-46e3-854b-04097f537694-kube-api-access-m2k6g\") pod \"network-check-target-x6gsn\" (UID: \"1d19bff6-e6ed-46e3-854b-04097f537694\") " pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:31:49.888246 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:49.888225 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wf844\"" Apr 20 13:31:49.891180 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:49.891163 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-5pkrd" Apr 20 13:31:49.896902 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:49.896886 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:31:50.006249 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:50.006222 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mqqqc\"" Apr 20 13:31:50.014297 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:50.014268 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lw64p" Apr 20 13:31:50.022025 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:50.021945 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-5pkrd"] Apr 20 13:31:50.024647 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:50.024620 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-c8wr5\"" Apr 20 13:31:50.025338 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:31:50.025307 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02041a2f_e9fd_4902_a9a4_47e4cd2889e4.slice/crio-81aea566b866d99147eb329c30fd3dea825b009fdf1aed8beda7fa9b22b35c90 WatchSource:0}: Error finding container 81aea566b866d99147eb329c30fd3dea825b009fdf1aed8beda7fa9b22b35c90: Status 404 returned error can't find the container with id 81aea566b866d99147eb329c30fd3dea825b009fdf1aed8beda7fa9b22b35c90 Apr 20 13:31:50.032126 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:50.032108 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-jq28g"
Apr 20 13:31:50.040915 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:50.040884 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-x6gsn"]
Apr 20 13:31:50.043683 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:31:50.043652 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d19bff6_e6ed_46e3_854b_04097f537694.slice/crio-cf42e45d95792dc4c400ea6efedf95f1a17ab649b5d5240e97831135721934f8 WatchSource:0}: Error finding container cf42e45d95792dc4c400ea6efedf95f1a17ab649b5d5240e97831135721934f8: Status 404 returned error can't find the container with id cf42e45d95792dc4c400ea6efedf95f1a17ab649b5d5240e97831135721934f8
Apr 20 13:31:50.185435 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:50.185381 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lw64p"]
Apr 20 13:31:50.188757 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:31:50.188734 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c0036f9_a08e_4eac_8ad7_301fe4765604.slice/crio-165b87690ad4ad4e4f44d980bc8da3c3dac29fb133867155ea765a8ab96e4e38 WatchSource:0}: Error finding container 165b87690ad4ad4e4f44d980bc8da3c3dac29fb133867155ea765a8ab96e4e38: Status 404 returned error can't find the container with id 165b87690ad4ad4e4f44d980bc8da3c3dac29fb133867155ea765a8ab96e4e38
Apr 20 13:31:50.209353 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:50.209323 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jq28g"]
Apr 20 13:31:50.212480 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:31:50.212448 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fd3c7eb_0c40_4911_a067_10cda31de0d7.slice/crio-21ff16b285845ee89cbd3c9ff23e64240715fe8fd2d9bc432172507f33757367 WatchSource:0}: Error finding container 21ff16b285845ee89cbd3c9ff23e64240715fe8fd2d9bc432172507f33757367: Status 404 returned error can't find the container with id 21ff16b285845ee89cbd3c9ff23e64240715fe8fd2d9bc432172507f33757367
Apr 20 13:31:50.400941 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:50.400844 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jq28g" event={"ID":"5fd3c7eb-0c40-4911-a067-10cda31de0d7","Type":"ContainerStarted","Data":"21ff16b285845ee89cbd3c9ff23e64240715fe8fd2d9bc432172507f33757367"}
Apr 20 13:31:50.401927 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:50.401901 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5pkrd" event={"ID":"02041a2f-e9fd-4902-a9a4-47e4cd2889e4","Type":"ContainerStarted","Data":"81aea566b866d99147eb329c30fd3dea825b009fdf1aed8beda7fa9b22b35c90"}
Apr 20 13:31:50.402833 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:50.402809 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lw64p" event={"ID":"8c0036f9-a08e-4eac-8ad7-301fe4765604","Type":"ContainerStarted","Data":"165b87690ad4ad4e4f44d980bc8da3c3dac29fb133867155ea765a8ab96e4e38"}
Apr 20 13:31:50.403961 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:50.403939 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-x6gsn" event={"ID":"1d19bff6-e6ed-46e3-854b-04097f537694","Type":"ContainerStarted","Data":"5c6c52c6d72681689c502ff6467b5817c008320e0a9b1eb7173c4d087f55801e"}
Apr 20 13:31:50.404040 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:50.403966 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-x6gsn" event={"ID":"1d19bff6-e6ed-46e3-854b-04097f537694","Type":"ContainerStarted","Data":"cf42e45d95792dc4c400ea6efedf95f1a17ab649b5d5240e97831135721934f8"}
Apr 20 13:31:50.404128 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:50.404114 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-x6gsn"
Apr 20 13:31:50.428085 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:50.428014 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-x6gsn" podStartSLOduration=65.428000157 podStartE2EDuration="1m5.428000157s" podCreationTimestamp="2026-04-20 13:30:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 13:31:50.42644002 +0000 UTC m=+65.990131290" watchObservedRunningTime="2026-04-20 13:31:50.428000157 +0000 UTC m=+65.991691418"
Apr 20 13:31:52.560164 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.560024 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5cbc5b474b-d42jh"]
Apr 20 13:31:52.563252 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.563225 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"]
Apr 20 13:31:52.563440 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.563420 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5cbc5b474b-d42jh"
Apr 20 13:31:52.567751 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.566824 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 20 13:31:52.567751 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.567125 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 20 13:31:52.567751 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.567330 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 20 13:31:52.567751 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.567572 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 20 13:31:52.570329 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.568265 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"
Apr 20 13:31:52.571814 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.571191 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 20 13:31:52.571814 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.571344 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 20 13:31:52.572385 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.571999 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 20 13:31:52.581301 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.578703 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 20 13:31:52.582545 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.581654 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5cbc5b474b-d42jh"]
Apr 20 13:31:52.582545 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.581685 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"]
Apr 20 13:31:52.588217 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.588181 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7f94986b8b-ms7wn"]
Apr 20 13:31:52.588798 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:31:52.588770 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn" podUID="0308410e-09ea-4fe7-82af-d5ff8756db17"
Apr 20 13:31:52.612919 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.612879 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq2fv\" (UniqueName: \"kubernetes.io/projected/53b09754-b5e3-491e-b440-2a6715d8ba6c-kube-api-access-gq2fv\") pod \"klusterlet-addon-workmgr-5cbc5b474b-d42jh\" (UID: \"53b09754-b5e3-491e-b440-2a6715d8ba6c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5cbc5b474b-d42jh"
Apr 20 13:31:52.613005 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.612937 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/53b09754-b5e3-491e-b440-2a6715d8ba6c-klusterlet-config\") pod \"klusterlet-addon-workmgr-5cbc5b474b-d42jh\" (UID: \"53b09754-b5e3-491e-b440-2a6715d8ba6c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5cbc5b474b-d42jh"
Apr 20 13:31:52.613005 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.612976 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/53b09754-b5e3-491e-b440-2a6715d8ba6c-tmp\") pod \"klusterlet-addon-workmgr-5cbc5b474b-d42jh\" (UID: \"53b09754-b5e3-491e-b440-2a6715d8ba6c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5cbc5b474b-d42jh"
Apr 20 13:31:52.679017 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.678982 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5844f64979-4t7rs"]
Apr 20 13:31:52.681338 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.681315 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.697383 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.697360 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5844f64979-4t7rs"]
Apr 20 13:31:52.714164 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.714134 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9mkn\" (UniqueName: \"kubernetes.io/projected/96ec9a85-ce5b-4fd3-81be-e6c048db4fdb-kube-api-access-z9mkn\") pod \"cluster-proxy-proxy-agent-7c65494d56-lv55b\" (UID: \"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"
Apr 20 13:31:52.714331 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.714268 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/96ec9a85-ce5b-4fd3-81be-e6c048db4fdb-hub\") pod \"cluster-proxy-proxy-agent-7c65494d56-lv55b\" (UID: \"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"
Apr 20 13:31:52.714399 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.714358 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/96ec9a85-ce5b-4fd3-81be-e6c048db4fdb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7c65494d56-lv55b\" (UID: \"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"
Apr 20 13:31:52.714497 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.714482 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/96ec9a85-ce5b-4fd3-81be-e6c048db4fdb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7c65494d56-lv55b\" (UID: \"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"
Apr 20 13:31:52.714600 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.714586 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/96ec9a85-ce5b-4fd3-81be-e6c048db4fdb-ca\") pod \"cluster-proxy-proxy-agent-7c65494d56-lv55b\" (UID: \"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"
Apr 20 13:31:52.714757 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.714739 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gq2fv\" (UniqueName: \"kubernetes.io/projected/53b09754-b5e3-491e-b440-2a6715d8ba6c-kube-api-access-gq2fv\") pod \"klusterlet-addon-workmgr-5cbc5b474b-d42jh\" (UID: \"53b09754-b5e3-491e-b440-2a6715d8ba6c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5cbc5b474b-d42jh"
Apr 20 13:31:52.714979 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.714956 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/53b09754-b5e3-491e-b440-2a6715d8ba6c-klusterlet-config\") pod \"klusterlet-addon-workmgr-5cbc5b474b-d42jh\" (UID: \"53b09754-b5e3-491e-b440-2a6715d8ba6c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5cbc5b474b-d42jh"
Apr 20 13:31:52.715144 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.715124 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/96ec9a85-ce5b-4fd3-81be-e6c048db4fdb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7c65494d56-lv55b\" (UID: \"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"
Apr 20 13:31:52.715257 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.715166 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/53b09754-b5e3-491e-b440-2a6715d8ba6c-tmp\") pod \"klusterlet-addon-workmgr-5cbc5b474b-d42jh\" (UID: \"53b09754-b5e3-491e-b440-2a6715d8ba6c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5cbc5b474b-d42jh"
Apr 20 13:31:52.715566 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.715546 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/53b09754-b5e3-491e-b440-2a6715d8ba6c-tmp\") pod \"klusterlet-addon-workmgr-5cbc5b474b-d42jh\" (UID: \"53b09754-b5e3-491e-b440-2a6715d8ba6c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5cbc5b474b-d42jh"
Apr 20 13:31:52.717924 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.717904 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/53b09754-b5e3-491e-b440-2a6715d8ba6c-klusterlet-config\") pod \"klusterlet-addon-workmgr-5cbc5b474b-d42jh\" (UID: \"53b09754-b5e3-491e-b440-2a6715d8ba6c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5cbc5b474b-d42jh"
Apr 20 13:31:52.744909 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.744870 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq2fv\" (UniqueName: \"kubernetes.io/projected/53b09754-b5e3-491e-b440-2a6715d8ba6c-kube-api-access-gq2fv\") pod \"klusterlet-addon-workmgr-5cbc5b474b-d42jh\" (UID: \"53b09754-b5e3-491e-b440-2a6715d8ba6c\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5cbc5b474b-d42jh"
Apr 20 13:31:52.816395 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.816351 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/74c7d945-ac32-4efe-ba31-a3bd50d2d706-ca-trust-extracted\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.816395 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.816392 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9mkn\" (UniqueName: \"kubernetes.io/projected/96ec9a85-ce5b-4fd3-81be-e6c048db4fdb-kube-api-access-z9mkn\") pod \"cluster-proxy-proxy-agent-7c65494d56-lv55b\" (UID: \"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"
Apr 20 13:31:52.816610 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.816466 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/96ec9a85-ce5b-4fd3-81be-e6c048db4fdb-hub\") pod \"cluster-proxy-proxy-agent-7c65494d56-lv55b\" (UID: \"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"
Apr 20 13:31:52.816610 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.816493 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/96ec9a85-ce5b-4fd3-81be-e6c048db4fdb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7c65494d56-lv55b\" (UID: \"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"
Apr 20 13:31:52.816610 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.816520 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm8hv\" (UniqueName: \"kubernetes.io/projected/74c7d945-ac32-4efe-ba31-a3bd50d2d706-kube-api-access-pm8hv\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.816610 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.816543 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/74c7d945-ac32-4efe-ba31-a3bd50d2d706-image-registry-private-configuration\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.816610 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.816568 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/96ec9a85-ce5b-4fd3-81be-e6c048db4fdb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7c65494d56-lv55b\" (UID: \"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"
Apr 20 13:31:52.816888 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.816730 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/96ec9a85-ce5b-4fd3-81be-e6c048db4fdb-ca\") pod \"cluster-proxy-proxy-agent-7c65494d56-lv55b\" (UID: \"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"
Apr 20 13:31:52.816888 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.816808 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/74c7d945-ac32-4efe-ba31-a3bd50d2d706-installation-pull-secrets\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.816888 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.816852 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/96ec9a85-ce5b-4fd3-81be-e6c048db4fdb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7c65494d56-lv55b\" (UID: \"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"
Apr 20 13:31:52.817025 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.816891 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/74c7d945-ac32-4efe-ba31-a3bd50d2d706-registry-certificates\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.817025 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.816919 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74c7d945-ac32-4efe-ba31-a3bd50d2d706-registry-tls\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.817025 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.816944 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74c7d945-ac32-4efe-ba31-a3bd50d2d706-trusted-ca\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.817025 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.817004 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74c7d945-ac32-4efe-ba31-a3bd50d2d706-bound-sa-token\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.817418 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.817315 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/96ec9a85-ce5b-4fd3-81be-e6c048db4fdb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7c65494d56-lv55b\" (UID: \"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"
Apr 20 13:31:52.819261 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.819238 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/96ec9a85-ce5b-4fd3-81be-e6c048db4fdb-ca\") pod \"cluster-proxy-proxy-agent-7c65494d56-lv55b\" (UID: \"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"
Apr 20 13:31:52.819378 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.819344 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/96ec9a85-ce5b-4fd3-81be-e6c048db4fdb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7c65494d56-lv55b\" (UID: \"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"
Apr 20 13:31:52.819433 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.819376 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/96ec9a85-ce5b-4fd3-81be-e6c048db4fdb-hub\") pod \"cluster-proxy-proxy-agent-7c65494d56-lv55b\" (UID: \"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"
Apr 20 13:31:52.819515 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.819499 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/96ec9a85-ce5b-4fd3-81be-e6c048db4fdb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7c65494d56-lv55b\" (UID: \"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"
Apr 20 13:31:52.829622 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.829602 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9mkn\" (UniqueName: \"kubernetes.io/projected/96ec9a85-ce5b-4fd3-81be-e6c048db4fdb-kube-api-access-z9mkn\") pod \"cluster-proxy-proxy-agent-7c65494d56-lv55b\" (UID: \"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"
Apr 20 13:31:52.881988 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.881910 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5cbc5b474b-d42jh"
Apr 20 13:31:52.901802 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.901767 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"
Apr 20 13:31:52.917844 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.917815 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/74c7d945-ac32-4efe-ba31-a3bd50d2d706-ca-trust-extracted\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.917965 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.917863 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pm8hv\" (UniqueName: \"kubernetes.io/projected/74c7d945-ac32-4efe-ba31-a3bd50d2d706-kube-api-access-pm8hv\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.917965 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.917888 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/74c7d945-ac32-4efe-ba31-a3bd50d2d706-image-registry-private-configuration\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.917965 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.917939 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/74c7d945-ac32-4efe-ba31-a3bd50d2d706-installation-pull-secrets\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.918160 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.917978 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/74c7d945-ac32-4efe-ba31-a3bd50d2d706-registry-certificates\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.918160 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.918004 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74c7d945-ac32-4efe-ba31-a3bd50d2d706-registry-tls\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.918160 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.918024 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74c7d945-ac32-4efe-ba31-a3bd50d2d706-trusted-ca\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.918160 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.918091 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74c7d945-ac32-4efe-ba31-a3bd50d2d706-bound-sa-token\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.918345 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.918317 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/74c7d945-ac32-4efe-ba31-a3bd50d2d706-ca-trust-extracted\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.919113 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.919087 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/74c7d945-ac32-4efe-ba31-a3bd50d2d706-registry-certificates\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.919472 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.919425 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/74c7d945-ac32-4efe-ba31-a3bd50d2d706-trusted-ca\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.921478 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.921448 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/74c7d945-ac32-4efe-ba31-a3bd50d2d706-image-registry-private-configuration\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.921582 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.921517 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/74c7d945-ac32-4efe-ba31-a3bd50d2d706-installation-pull-secrets\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.921626 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.921584 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/74c7d945-ac32-4efe-ba31-a3bd50d2d706-registry-tls\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.933578 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.933550 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm8hv\" (UniqueName: \"kubernetes.io/projected/74c7d945-ac32-4efe-ba31-a3bd50d2d706-kube-api-access-pm8hv\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.934351 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.934325 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/74c7d945-ac32-4efe-ba31-a3bd50d2d706-bound-sa-token\") pod \"image-registry-5844f64979-4t7rs\" (UID: \"74c7d945-ac32-4efe-ba31-a3bd50d2d706\") " pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:52.995427 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:52.994652 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xw967\""
Apr 20 13:31:53.001657 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.001616 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:53.014437 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.014402 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5cbc5b474b-d42jh"]
Apr 20 13:31:53.018140 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:31:53.018104 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53b09754_b5e3_491e_b440_2a6715d8ba6c.slice/crio-ae69b729a22ce9388d740353079f56d8fb7e7dbfc6083174c75506ae7aed79b9 WatchSource:0}: Error finding container ae69b729a22ce9388d740353079f56d8fb7e7dbfc6083174c75506ae7aed79b9: Status 404 returned error can't find the container with id ae69b729a22ce9388d740353079f56d8fb7e7dbfc6083174c75506ae7aed79b9
Apr 20 13:31:53.049086 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.049041 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b"]
Apr 20 13:31:53.051745 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:31:53.051715 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96ec9a85_ce5b_4fd3_81be_e6c048db4fdb.slice/crio-cd709cb235cfa429f49de08c08cd07cf5a2a5278b95f3ad88311ec94e0e3678f WatchSource:0}: Error finding container cd709cb235cfa429f49de08c08cd07cf5a2a5278b95f3ad88311ec94e0e3678f: Status 404 returned error can't find the container with id cd709cb235cfa429f49de08c08cd07cf5a2a5278b95f3ad88311ec94e0e3678f
Apr 20 13:31:53.166696 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.166669 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5844f64979-4t7rs"]
Apr 20 13:31:53.168379 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:31:53.168354 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74c7d945_ac32_4efe_ba31_a3bd50d2d706.slice/crio-893eed8379d18620feed12a0055da39e35f0869c2ce251c334e6e34431a17748 WatchSource:0}: Error finding container 893eed8379d18620feed12a0055da39e35f0869c2ce251c334e6e34431a17748: Status 404 returned error can't find the container with id 893eed8379d18620feed12a0055da39e35f0869c2ce251c334e6e34431a17748
Apr 20 13:31:53.416884 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.416785 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jq28g" event={"ID":"5fd3c7eb-0c40-4911-a067-10cda31de0d7","Type":"ContainerStarted","Data":"6581c6878e76e452cc2e6d6a9805c86f0cff7eecca22c43a510fbd95f52d3243"}
Apr 20 13:31:53.416884 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.416836 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jq28g" event={"ID":"5fd3c7eb-0c40-4911-a067-10cda31de0d7","Type":"ContainerStarted","Data":"4c3934c34efd0611b9efaac5af608b2e7e38358db08ad509998ba2a964d0abbc"}
Apr 20 13:31:53.417247 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.416946 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-jq28g"
Apr 20 13:31:53.418393 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.418368 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5pkrd" event={"ID":"02041a2f-e9fd-4902-a9a4-47e4cd2889e4","Type":"ContainerStarted","Data":"cb712fa948a09f510e1a926d45ebe141d3a1bbe263840b565b182962495f4a9e"}
Apr 20 13:31:53.418504 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.418401 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-5pkrd" event={"ID":"02041a2f-e9fd-4902-a9a4-47e4cd2889e4","Type":"ContainerStarted","Data":"463cbd8b2844f1d21ca7e7add02170745460123ab4b10f9c0e92d9010f483590"}
Apr 20 13:31:53.419459 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.419434 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b" event={"ID":"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb","Type":"ContainerStarted","Data":"cd709cb235cfa429f49de08c08cd07cf5a2a5278b95f3ad88311ec94e0e3678f"}
Apr 20 13:31:53.420674 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.420650 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lw64p" event={"ID":"8c0036f9-a08e-4eac-8ad7-301fe4765604","Type":"ContainerStarted","Data":"9cea4ea28e5a843771ddae2cbae5cde0e0953adad683d85c2f2f148af2d21cde"}
Apr 20 13:31:53.422195 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.422171 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5844f64979-4t7rs" event={"ID":"74c7d945-ac32-4efe-ba31-a3bd50d2d706","Type":"ContainerStarted","Data":"7efa9003e8747ecf9fd3be889c19e0aae021510f8866e5242c30af4417d7e8f9"}
Apr 20 13:31:53.422320 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.422202 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5844f64979-4t7rs" event={"ID":"74c7d945-ac32-4efe-ba31-a3bd50d2d706","Type":"ContainerStarted","Data":"893eed8379d18620feed12a0055da39e35f0869c2ce251c334e6e34431a17748"}
Apr 20 13:31:53.422320 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.422252 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5844f64979-4t7rs"
Apr 20 13:31:53.423885 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.423852 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5cbc5b474b-d42jh" event={"ID":"53b09754-b5e3-491e-b440-2a6715d8ba6c","Type":"ContainerStarted","Data":"ae69b729a22ce9388d740353079f56d8fb7e7dbfc6083174c75506ae7aed79b9"}
Apr 20 13:31:53.424472 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.424450 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:53.430038 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.430017 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:53.436535 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.436476 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jq28g" podStartSLOduration=34.285409927 podStartE2EDuration="36.436459399s" podCreationTimestamp="2026-04-20 13:31:17 +0000 UTC" firstStartedPulling="2026-04-20 13:31:50.214362013 +0000 UTC m=+65.778053261" lastFinishedPulling="2026-04-20 13:31:52.365411471 +0000 UTC m=+67.929102733" observedRunningTime="2026-04-20 13:31:53.435122516 +0000 UTC m=+68.998813788" watchObservedRunningTime="2026-04-20 13:31:53.436459399 +0000 UTC m=+69.000150670"
Apr 20 13:31:53.453495 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.453439 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-lw64p" podStartSLOduration=34.276109534 podStartE2EDuration="36.453422827s" podCreationTimestamp="2026-04-20 13:31:17 +0000 UTC" firstStartedPulling="2026-04-20 13:31:50.190484681 +0000 UTC m=+65.754175932" lastFinishedPulling="2026-04-20 13:31:52.367797976 +0000 UTC m=+67.931489225" observedRunningTime="2026-04-20 13:31:53.451727745 +0000 UTC m=+69.015419026" watchObservedRunningTime="2026-04-20 13:31:53.453422827 +0000 UTC m=+69.017114095"
Apr 20 13:31:53.472391 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.472344 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-5pkrd" podStartSLOduration=66.138902168 podStartE2EDuration="1m8.4723304s" podCreationTimestamp="2026-04-20 13:30:45 +0000 UTC" firstStartedPulling="2026-04-20 13:31:50.027108342 +0000 UTC m=+65.590799594" lastFinishedPulling="2026-04-20 13:31:52.360536575 +0000 UTC m=+67.924227826" observedRunningTime="2026-04-20 13:31:53.471773574 +0000 UTC m=+69.035464843" watchObservedRunningTime="2026-04-20 13:31:53.4723304 +0000 UTC m=+69.036021721"
Apr 20 13:31:53.502560 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.502503 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5844f64979-4t7rs" podStartSLOduration=1.502488426 podStartE2EDuration="1.502488426s" podCreationTimestamp="2026-04-20 13:31:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 13:31:53.501659658 +0000 UTC m=+69.065350940" watchObservedRunningTime="2026-04-20 13:31:53.502488426 +0000 UTC m=+69.066179690"
Apr 20 13:31:53.523492 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.523462 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0308410e-09ea-4fe7-82af-d5ff8756db17-ca-trust-extracted\") pod \"0308410e-09ea-4fe7-82af-d5ff8756db17\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") "
Apr 20 13:31:53.523492 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.523498 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-bound-sa-token\") pod \"0308410e-09ea-4fe7-82af-d5ff8756db17\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") "
Apr 20 13:31:53.523725 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.523536 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9s6x\" (UniqueName: \"kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-kube-api-access-w9s6x\") pod \"0308410e-09ea-4fe7-82af-d5ff8756db17\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") "
Apr 20 13:31:53.523725 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.523559 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0308410e-09ea-4fe7-82af-d5ff8756db17-image-registry-private-configuration\") pod \"0308410e-09ea-4fe7-82af-d5ff8756db17\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") "
Apr 20 13:31:53.523725 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.523592 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0308410e-09ea-4fe7-82af-d5ff8756db17-trusted-ca\") pod \"0308410e-09ea-4fe7-82af-d5ff8756db17\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") "
Apr 20 13:31:53.523725 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.523616 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-certificates\") pod \"0308410e-09ea-4fe7-82af-d5ff8756db17\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") "
Apr 20 13:31:53.523725 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.523647 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0308410e-09ea-4fe7-82af-d5ff8756db17-installation-pull-secrets\") pod \"0308410e-09ea-4fe7-82af-d5ff8756db17\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") "
Apr 20 13:31:53.523956 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.523733 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0308410e-09ea-4fe7-82af-d5ff8756db17-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0308410e-09ea-4fe7-82af-d5ff8756db17" (UID: "0308410e-09ea-4fe7-82af-d5ff8756db17"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 13:31:53.524043 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.524000 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0308410e-09ea-4fe7-82af-d5ff8756db17" (UID: "0308410e-09ea-4fe7-82af-d5ff8756db17"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 13:31:53.525441 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.524600 2563 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-certificates\") on node \"ip-10-0-132-232.ec2.internal\" DevicePath \"\""
Apr 20 13:31:53.525441 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.524626 2563 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0308410e-09ea-4fe7-82af-d5ff8756db17-ca-trust-extracted\") on node \"ip-10-0-132-232.ec2.internal\" DevicePath \"\""
Apr 20 13:31:53.525441 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.525379 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0308410e-09ea-4fe7-82af-d5ff8756db17-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0308410e-09ea-4fe7-82af-d5ff8756db17" (UID: "0308410e-09ea-4fe7-82af-d5ff8756db17"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 13:31:53.526587 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.526560 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-kube-api-access-w9s6x" (OuterVolumeSpecName: "kube-api-access-w9s6x") pod "0308410e-09ea-4fe7-82af-d5ff8756db17" (UID: "0308410e-09ea-4fe7-82af-d5ff8756db17"). InnerVolumeSpecName "kube-api-access-w9s6x". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 13:31:53.527100 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.527030 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0308410e-09ea-4fe7-82af-d5ff8756db17-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "0308410e-09ea-4fe7-82af-d5ff8756db17" (UID: "0308410e-09ea-4fe7-82af-d5ff8756db17"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 13:31:53.527205 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.527163 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0308410e-09ea-4fe7-82af-d5ff8756db17" (UID: "0308410e-09ea-4fe7-82af-d5ff8756db17"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 13:31:53.527554 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.527534 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0308410e-09ea-4fe7-82af-d5ff8756db17-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0308410e-09ea-4fe7-82af-d5ff8756db17" (UID: "0308410e-09ea-4fe7-82af-d5ff8756db17"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 13:31:53.625916 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.625875 2563 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-bound-sa-token\") on node \"ip-10-0-132-232.ec2.internal\" DevicePath \"\""
Apr 20 13:31:53.625916 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.625919 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w9s6x\" (UniqueName: \"kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-kube-api-access-w9s6x\") on node \"ip-10-0-132-232.ec2.internal\" DevicePath \"\""
Apr 20 13:31:53.626419 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.625933 2563 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0308410e-09ea-4fe7-82af-d5ff8756db17-image-registry-private-configuration\") on node \"ip-10-0-132-232.ec2.internal\" DevicePath \"\""
Apr 20 13:31:53.626419 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.625946 2563 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0308410e-09ea-4fe7-82af-d5ff8756db17-trusted-ca\") on node \"ip-10-0-132-232.ec2.internal\" DevicePath \"\""
Apr 20 13:31:53.626419 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:53.625961 2563 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0308410e-09ea-4fe7-82af-d5ff8756db17-installation-pull-secrets\") on node \"ip-10-0-132-232.ec2.internal\" DevicePath \"\""
Apr 20 13:31:54.427465 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:54.427435 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:54.475912 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:54.475875 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7f94986b8b-ms7wn"]
Apr 20 13:31:54.482915 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:54.482881 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7f94986b8b-ms7wn"]
Apr 20 13:31:54.535203 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:54.535157 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:54.540443 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:54.540415 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls\") pod \"image-registry-7f94986b8b-ms7wn\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") " pod="openshift-image-registry/image-registry-7f94986b8b-ms7wn"
Apr 20 13:31:54.636633 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:54.636596 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls\") pod \"0308410e-09ea-4fe7-82af-d5ff8756db17\" (UID: \"0308410e-09ea-4fe7-82af-d5ff8756db17\") "
Apr 20 13:31:54.639398 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:54.639356 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0308410e-09ea-4fe7-82af-d5ff8756db17" (UID: "0308410e-09ea-4fe7-82af-d5ff8756db17"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 13:31:54.738006 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:54.737917 2563 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0308410e-09ea-4fe7-82af-d5ff8756db17-registry-tls\") on node \"ip-10-0-132-232.ec2.internal\" DevicePath \"\""
Apr 20 13:31:55.068166 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:55.068125 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0308410e-09ea-4fe7-82af-d5ff8756db17" path="/var/lib/kubelet/pods/0308410e-09ea-4fe7-82af-d5ff8756db17/volumes"
Apr 20 13:31:56.434340 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:56.434260 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b" event={"ID":"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb","Type":"ContainerStarted","Data":"2d069b254e701875b79b7783e1eb2728e8a8868a87941b1dd868dccfc338fda0"}
Apr 20 13:31:59.445483 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:59.445442 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5cbc5b474b-d42jh" event={"ID":"53b09754-b5e3-491e-b440-2a6715d8ba6c","Type":"ContainerStarted","Data":"465269c665c1568c8e3437321882564683c25b65f56a0d43fbe0e7e3512f37ab"}
Apr 20 13:31:59.445908 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:59.445790 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5cbc5b474b-d42jh"
Apr 20 13:31:59.447433 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:59.447409 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5cbc5b474b-d42jh"
Apr 20 13:31:59.484360 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:31:59.484301 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5cbc5b474b-d42jh" podStartSLOduration=2.174440907 podStartE2EDuration="7.484284774s" podCreationTimestamp="2026-04-20 13:31:52 +0000 UTC" firstStartedPulling="2026-04-20 13:31:53.020316731 +0000 UTC m=+68.584007983" lastFinishedPulling="2026-04-20 13:31:58.330160587 +0000 UTC m=+73.893851850" observedRunningTime="2026-04-20 13:31:59.465457013 +0000 UTC m=+75.029148283" watchObservedRunningTime="2026-04-20 13:31:59.484284774 +0000 UTC m=+75.047976043"
Apr 20 13:32:00.449732 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:00.449707 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b" event={"ID":"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb","Type":"ContainerStarted","Data":"e4873dec1ab8ce66afe56488a910f96688895535852d9df2a41c0e7683b10b24"}
Apr 20 13:32:01.454331 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:01.454295 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b" event={"ID":"96ec9a85-ce5b-4fd3-81be-e6c048db4fdb","Type":"ContainerStarted","Data":"06160163eee9b64af3f6cef797555600d2390e534d1736b59cd777a39d67c46a"}
Apr 20 13:32:01.475197 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:01.475143 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7c65494d56-lv55b" podStartSLOduration=2.188107684 podStartE2EDuration="9.475123189s" podCreationTimestamp="2026-04-20 13:31:52 +0000 UTC" firstStartedPulling="2026-04-20 13:31:53.053555975 +0000 UTC m=+68.617247239" lastFinishedPulling="2026-04-20 13:32:00.340571496 +0000 UTC m=+75.904262744" observedRunningTime="2026-04-20 13:32:01.473771443 +0000 UTC m=+77.037462714" watchObservedRunningTime="2026-04-20 13:32:01.475123189 +0000 UTC m=+77.038814460"
Apr 20 13:32:03.429747 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:03.429710 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jq28g"
Apr 20 13:32:09.909264 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:09.909230 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-wxck5"]
Apr 20 13:32:09.914215 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:09.914192 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:09.916437 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:09.916413 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 13:32:09.916645 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:09.916622 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 13:32:09.917659 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:09.917641 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 13:32:09.917801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:09.917659 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-dzwpq\""
Apr 20 13:32:09.917801 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:09.917659 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 13:32:09.917960 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:09.917670 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 13:32:09.917960 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:09.917673 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 13:32:10.062020 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.061980 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d8055fd5-cb97-495a-88d8-f9b6239b99f3-root\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.062020 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.062018 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d8055fd5-cb97-495a-88d8-f9b6239b99f3-node-exporter-textfile\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.062291 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.062042 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d8055fd5-cb97-495a-88d8-f9b6239b99f3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.062291 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.062076 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d8055fd5-cb97-495a-88d8-f9b6239b99f3-node-exporter-tls\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.062291 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.062145 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d8055fd5-cb97-495a-88d8-f9b6239b99f3-node-exporter-wtmp\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.062291 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.062206 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d8055fd5-cb97-495a-88d8-f9b6239b99f3-metrics-client-ca\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.062291 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.062239 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d8055fd5-cb97-495a-88d8-f9b6239b99f3-node-exporter-accelerators-collector-config\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.062291 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.062278 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8055fd5-cb97-495a-88d8-f9b6239b99f3-sys\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.062597 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.062334 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sg47\" (UniqueName: \"kubernetes.io/projected/d8055fd5-cb97-495a-88d8-f9b6239b99f3-kube-api-access-9sg47\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.163402 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.163316 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8055fd5-cb97-495a-88d8-f9b6239b99f3-sys\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.163402 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.163372 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9sg47\" (UniqueName: \"kubernetes.io/projected/d8055fd5-cb97-495a-88d8-f9b6239b99f3-kube-api-access-9sg47\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.163589 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.163446 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d8055fd5-cb97-495a-88d8-f9b6239b99f3-sys\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.163589 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.163502 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d8055fd5-cb97-495a-88d8-f9b6239b99f3-root\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.163589 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.163537 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d8055fd5-cb97-495a-88d8-f9b6239b99f3-node-exporter-textfile\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.163589 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.163568 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d8055fd5-cb97-495a-88d8-f9b6239b99f3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.163781 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.163600 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d8055fd5-cb97-495a-88d8-f9b6239b99f3-node-exporter-tls\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.163781 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.163627 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d8055fd5-cb97-495a-88d8-f9b6239b99f3-root\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.163781 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:32:10.163699 2563 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 13:32:10.163932 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:32:10.163787 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8055fd5-cb97-495a-88d8-f9b6239b99f3-node-exporter-tls podName:d8055fd5-cb97-495a-88d8-f9b6239b99f3 nodeName:}" failed. No retries permitted until 2026-04-20 13:32:10.663766033 +0000 UTC m=+86.227457288 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/d8055fd5-cb97-495a-88d8-f9b6239b99f3-node-exporter-tls") pod "node-exporter-wxck5" (UID: "d8055fd5-cb97-495a-88d8-f9b6239b99f3") : secret "node-exporter-tls" not found
Apr 20 13:32:10.163932 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.163817 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d8055fd5-cb97-495a-88d8-f9b6239b99f3-node-exporter-wtmp\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.164081 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.164032 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d8055fd5-cb97-495a-88d8-f9b6239b99f3-node-exporter-textfile\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.164174 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.164153 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d8055fd5-cb97-495a-88d8-f9b6239b99f3-metrics-client-ca\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.164230 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.164200 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d8055fd5-cb97-495a-88d8-f9b6239b99f3-node-exporter-accelerators-collector-config\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.164570 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.164504 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d8055fd5-cb97-495a-88d8-f9b6239b99f3-metrics-client-ca\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.164694 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.164629 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d8055fd5-cb97-495a-88d8-f9b6239b99f3-node-exporter-wtmp\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.164694 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.164665 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d8055fd5-cb97-495a-88d8-f9b6239b99f3-node-exporter-accelerators-collector-config\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.166392 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.166375 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d8055fd5-cb97-495a-88d8-f9b6239b99f3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.173013 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.172982 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sg47\" (UniqueName: \"kubernetes.io/projected/d8055fd5-cb97-495a-88d8-f9b6239b99f3-kube-api-access-9sg47\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.667409 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.667368 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d8055fd5-cb97-495a-88d8-f9b6239b99f3-node-exporter-tls\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.669962 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.669934 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d8055fd5-cb97-495a-88d8-f9b6239b99f3-node-exporter-tls\") pod \"node-exporter-wxck5\" (UID: \"d8055fd5-cb97-495a-88d8-f9b6239b99f3\") " pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.823403 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:10.823368 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-wxck5"
Apr 20 13:32:10.833125 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:32:10.833088 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8055fd5_cb97_495a_88d8_f9b6239b99f3.slice/crio-a36945832b80dbf040a228d2b5c81008663f796563f1aff02cf838206667c44d WatchSource:0}: Error finding container a36945832b80dbf040a228d2b5c81008663f796563f1aff02cf838206667c44d: Status 404 returned error can't find the container with id a36945832b80dbf040a228d2b5c81008663f796563f1aff02cf838206667c44d
Apr 20 13:32:11.483007 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:11.482966 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wxck5" event={"ID":"d8055fd5-cb97-495a-88d8-f9b6239b99f3","Type":"ContainerStarted","Data":"a36945832b80dbf040a228d2b5c81008663f796563f1aff02cf838206667c44d"}
Apr 20 13:32:12.487024 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:12.486984 2563 generic.go:358] "Generic (PLEG): container finished" podID="d8055fd5-cb97-495a-88d8-f9b6239b99f3" containerID="997bdf21b63e28777cf282298fca207b9a6cbcc70d278511d72df06fae51fc0d" exitCode=0
Apr 20 13:32:12.487437 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:12.487076 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wxck5" event={"ID":"d8055fd5-cb97-495a-88d8-f9b6239b99f3","Type":"ContainerDied","Data":"997bdf21b63e28777cf282298fca207b9a6cbcc70d278511d72df06fae51fc0d"}
Apr 20 13:32:13.491821 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:13.491783 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wxck5" event={"ID":"d8055fd5-cb97-495a-88d8-f9b6239b99f3","Type":"ContainerStarted","Data":"399f732b52e414f7c8ba8aa8cd27b884a7b8806b3e38b28a56fea37c2fb342b8"}
Apr 20 13:32:13.491821 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:13.491825 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wxck5"
event={"ID":"d8055fd5-cb97-495a-88d8-f9b6239b99f3","Type":"ContainerStarted","Data":"b6aad5ed1574d89c4615b1cf9bbac4370552cc74e75bd0ddeba44a23a9a05c69"} Apr 20 13:32:13.515850 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:13.515795 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-wxck5" podStartSLOduration=3.658020833 podStartE2EDuration="4.515777585s" podCreationTimestamp="2026-04-20 13:32:09 +0000 UTC" firstStartedPulling="2026-04-20 13:32:10.835185271 +0000 UTC m=+86.398876536" lastFinishedPulling="2026-04-20 13:32:11.69294202 +0000 UTC m=+87.256633288" observedRunningTime="2026-04-20 13:32:13.514717214 +0000 UTC m=+89.078408479" watchObservedRunningTime="2026-04-20 13:32:13.515777585 +0000 UTC m=+89.079468855" Apr 20 13:32:14.433234 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:14.433207 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5844f64979-4t7rs" Apr 20 13:32:21.409996 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:21.409967 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-x6gsn" Apr 20 13:32:40.570553 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:40.570519 2563 generic.go:358] "Generic (PLEG): container finished" podID="5f9cc52f-4998-4934-a999-16ee91bf3d4a" containerID="b4f5b4ff684648b586f778b562940e381f0a293822c8cf436a332232dbb3126b" exitCode=0 Apr 20 13:32:40.570969 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:40.570604 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-qrddw" event={"ID":"5f9cc52f-4998-4934-a999-16ee91bf3d4a","Type":"ContainerDied","Data":"b4f5b4ff684648b586f778b562940e381f0a293822c8cf436a332232dbb3126b"} Apr 20 13:32:40.570969 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:40.570939 2563 scope.go:117] "RemoveContainer" containerID="b4f5b4ff684648b586f778b562940e381f0a293822c8cf436a332232dbb3126b" Apr 20 13:32:41.575188 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:41.575145 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-qrddw" event={"ID":"5f9cc52f-4998-4934-a999-16ee91bf3d4a","Type":"ContainerStarted","Data":"ef87467ff463c7764c8cd2911eaeb0cb2de18c8dbe8094e2b4e4305fc1052158"} Apr 20 13:32:44.983823 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:44.983789 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wxck5_d8055fd5-cb97-495a-88d8-f9b6239b99f3/init-textfile/0.log" Apr 20 13:32:45.184699 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:45.184673 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wxck5_d8055fd5-cb97-495a-88d8-f9b6239b99f3/node-exporter/0.log" Apr 20 13:32:45.388552 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:45.388522 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wxck5_d8055fd5-cb97-495a-88d8-f9b6239b99f3/kube-rbac-proxy/0.log" Apr 20 13:32:50.601954 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:50.601920 2563 generic.go:358] "Generic (PLEG): container finished" podID="4f159e37-18ec-4deb-a9fa-9b41fad19818" containerID="fa2ca5fc85d16b6c0f68403d04a69b0feb1149fbfb3c16f23cefd710eaa38d6e" exitCode=0 Apr 20 13:32:50.602354 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:50.601967 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lrfsk" event={"ID":"4f159e37-18ec-4deb-a9fa-9b41fad19818","Type":"ContainerDied","Data":"fa2ca5fc85d16b6c0f68403d04a69b0feb1149fbfb3c16f23cefd710eaa38d6e"} Apr 20 13:32:50.602354 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:50.602258 2563 scope.go:117] "RemoveContainer" containerID="fa2ca5fc85d16b6c0f68403d04a69b0feb1149fbfb3c16f23cefd710eaa38d6e" Apr 20 13:32:51.606919 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:32:51.606883 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-lrfsk" event={"ID":"4f159e37-18ec-4deb-a9fa-9b41fad19818","Type":"ContainerStarted","Data":"25913baed43f86851ee79034feea7fcf41baca4dd5c4f13d593b69f1d52654bc"} Apr 20 13:35:44.953432 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:35:44.953405 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/ovn-acl-logging/0.log" Apr 20 13:35:44.953924 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:35:44.953405 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/ovn-acl-logging/0.log" Apr 20 13:35:44.959808 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:35:44.959783 2563 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 13:37:10.309922 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:10.309687 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-ms4qc"] Apr 20 13:37:10.312904 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:10.312884 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-ms4qc" Apr 20 13:37:10.319140 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:10.319107 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 13:37:10.319290 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:10.319204 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 13:37:10.319290 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:10.319220 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 13:37:10.319290 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:10.319282 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 13:37:10.319466 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:10.319450 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-zmf5m\"" Apr 20 13:37:10.328471 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:10.328438 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-ms4qc"] Apr 20 13:37:10.385365 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:10.385322 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c062ebea-ac1f-4302-8c43-6f2082de5b19-webhook-cert\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-ms4qc\" (UID: 
\"c062ebea-ac1f-4302-8c43-6f2082de5b19\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-ms4qc" Apr 20 13:37:10.385528 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:10.385383 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c062ebea-ac1f-4302-8c43-6f2082de5b19-apiservice-cert\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-ms4qc\" (UID: \"c062ebea-ac1f-4302-8c43-6f2082de5b19\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-ms4qc" Apr 20 13:37:10.385528 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:10.385471 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-648nv\" (UniqueName: \"kubernetes.io/projected/c062ebea-ac1f-4302-8c43-6f2082de5b19-kube-api-access-648nv\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-ms4qc\" (UID: \"c062ebea-ac1f-4302-8c43-6f2082de5b19\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-ms4qc" Apr 20 13:37:10.486061 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:10.486022 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-648nv\" (UniqueName: \"kubernetes.io/projected/c062ebea-ac1f-4302-8c43-6f2082de5b19-kube-api-access-648nv\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-ms4qc\" (UID: \"c062ebea-ac1f-4302-8c43-6f2082de5b19\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-ms4qc" Apr 20 13:37:10.486242 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:10.486105 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c062ebea-ac1f-4302-8c43-6f2082de5b19-webhook-cert\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-ms4qc\" (UID: \"c062ebea-ac1f-4302-8c43-6f2082de5b19\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-ms4qc" Apr 20 13:37:10.486242 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:10.486144 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c062ebea-ac1f-4302-8c43-6f2082de5b19-apiservice-cert\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-ms4qc\" (UID: \"c062ebea-ac1f-4302-8c43-6f2082de5b19\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-ms4qc" Apr 20 13:37:10.488803 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:10.488779 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c062ebea-ac1f-4302-8c43-6f2082de5b19-webhook-cert\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-ms4qc\" (UID: \"c062ebea-ac1f-4302-8c43-6f2082de5b19\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-ms4qc" Apr 20 13:37:10.488902 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:10.488882 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c062ebea-ac1f-4302-8c43-6f2082de5b19-apiservice-cert\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-ms4qc\" (UID: \"c062ebea-ac1f-4302-8c43-6f2082de5b19\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-ms4qc" Apr 20 13:37:10.508219 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:10.508187 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-648nv\" (UniqueName: \"kubernetes.io/projected/c062ebea-ac1f-4302-8c43-6f2082de5b19-kube-api-access-648nv\") pod \"opendatahub-operator-controller-manager-8cd4c57cb-ms4qc\" (UID: \"c062ebea-ac1f-4302-8c43-6f2082de5b19\") " pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-ms4qc" Apr 20 13:37:10.627587 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:10.627502 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-ms4qc" Apr 20 13:37:10.759639 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:10.759614 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-ms4qc"] Apr 20 13:37:10.763139 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:37:10.763103 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc062ebea_ac1f_4302_8c43_6f2082de5b19.slice/crio-9efbf7a8403ce5e1c04e93a47d427264faa9a722028c34b850399d2091031a7f WatchSource:0}: Error finding container 9efbf7a8403ce5e1c04e93a47d427264faa9a722028c34b850399d2091031a7f: Status 404 returned error can't find the container with id 9efbf7a8403ce5e1c04e93a47d427264faa9a722028c34b850399d2091031a7f Apr 20 13:37:10.764808 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:10.764789 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 13:37:11.295957 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:11.295922 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-ms4qc" event={"ID":"c062ebea-ac1f-4302-8c43-6f2082de5b19","Type":"ContainerStarted","Data":"9efbf7a8403ce5e1c04e93a47d427264faa9a722028c34b850399d2091031a7f"} Apr 20 13:37:14.306090 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:14.306035 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-ms4qc" event={"ID":"c062ebea-ac1f-4302-8c43-6f2082de5b19","Type":"ContainerStarted","Data":"29dafbcb896c522191919feb962d754be2d2ebcf28468bc5c718537c15f54982"} Apr 20 13:37:14.306478 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:14.306169 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-ms4qc" Apr 20 13:37:14.329022 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:14.328971 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-ms4qc" podStartSLOduration=1.8416290800000001 podStartE2EDuration="4.328955304s" podCreationTimestamp="2026-04-20 13:37:10 +0000 UTC" firstStartedPulling="2026-04-20 13:37:10.764925001 +0000 UTC m=+386.328616249" lastFinishedPulling="2026-04-20 13:37:13.252251219 +0000 UTC m=+388.815942473" observedRunningTime="2026-04-20 13:37:14.328361987 +0000 UTC m=+389.892053256" watchObservedRunningTime="2026-04-20 13:37:14.328955304 +0000 UTC m=+389.892646575" Apr 20 13:37:25.310247 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:25.310214 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-8cd4c57cb-ms4qc" Apr 20 13:37:31.981816 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:31.981782 2563 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["opendatahub/odh-model-controller-858dbf95b8-scs84"] Apr 20 13:37:31.986450 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:31.986430 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-scs84" Apr 20 13:37:31.988835 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:31.988801 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-mxv6n\"" Apr 20 13:37:31.988835 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:31.988801 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 20 13:37:31.994508 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:31.994485 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-scs84"] Apr 20 13:37:32.060432 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:32.060396 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/090657bd-bd1c-4a9c-965a-1e6543166a2a-cert\") pod \"odh-model-controller-858dbf95b8-scs84\" (UID: \"090657bd-bd1c-4a9c-965a-1e6543166a2a\") " pod="opendatahub/odh-model-controller-858dbf95b8-scs84" Apr 20 13:37:32.060625 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:32.060522 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r9fc\" (UniqueName: \"kubernetes.io/projected/090657bd-bd1c-4a9c-965a-1e6543166a2a-kube-api-access-4r9fc\") pod \"odh-model-controller-858dbf95b8-scs84\" (UID: \"090657bd-bd1c-4a9c-965a-1e6543166a2a\") " pod="opendatahub/odh-model-controller-858dbf95b8-scs84" Apr 20 13:37:32.161817 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:32.161778 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4r9fc\" (UniqueName: \"kubernetes.io/projected/090657bd-bd1c-4a9c-965a-1e6543166a2a-kube-api-access-4r9fc\") pod \"odh-model-controller-858dbf95b8-scs84\" (UID: \"090657bd-bd1c-4a9c-965a-1e6543166a2a\") " pod="opendatahub/odh-model-controller-858dbf95b8-scs84" Apr 20 13:37:32.162024 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:32.161828 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/090657bd-bd1c-4a9c-965a-1e6543166a2a-cert\") pod \"odh-model-controller-858dbf95b8-scs84\" (UID: \"090657bd-bd1c-4a9c-965a-1e6543166a2a\") " pod="opendatahub/odh-model-controller-858dbf95b8-scs84" Apr 20 13:37:32.162024 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:37:32.161925 2563 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 20 13:37:32.162024 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:37:32.161989 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/090657bd-bd1c-4a9c-965a-1e6543166a2a-cert podName:090657bd-bd1c-4a9c-965a-1e6543166a2a nodeName:}" failed. No retries permitted until 2026-04-20 13:37:32.661972761 +0000 UTC m=+408.225664009 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/090657bd-bd1c-4a9c-965a-1e6543166a2a-cert") pod "odh-model-controller-858dbf95b8-scs84" (UID: "090657bd-bd1c-4a9c-965a-1e6543166a2a") : secret "odh-model-controller-webhook-cert" not found Apr 20 13:37:32.171528 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:32.171504 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r9fc\" (UniqueName: \"kubernetes.io/projected/090657bd-bd1c-4a9c-965a-1e6543166a2a-kube-api-access-4r9fc\") pod \"odh-model-controller-858dbf95b8-scs84\" (UID: \"090657bd-bd1c-4a9c-965a-1e6543166a2a\") " pod="opendatahub/odh-model-controller-858dbf95b8-scs84" Apr 20 13:37:32.665729 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:32.665694 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/090657bd-bd1c-4a9c-965a-1e6543166a2a-cert\") pod \"odh-model-controller-858dbf95b8-scs84\" (UID: \"090657bd-bd1c-4a9c-965a-1e6543166a2a\") " pod="opendatahub/odh-model-controller-858dbf95b8-scs84" Apr 20 13:37:32.665934 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:37:32.665848 2563 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 20 13:37:32.665934 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:37:32.665917 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/090657bd-bd1c-4a9c-965a-1e6543166a2a-cert podName:090657bd-bd1c-4a9c-965a-1e6543166a2a nodeName:}" failed. No retries permitted until 2026-04-20 13:37:33.665902297 +0000 UTC m=+409.229593545 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/090657bd-bd1c-4a9c-965a-1e6543166a2a-cert") pod "odh-model-controller-858dbf95b8-scs84" (UID: "090657bd-bd1c-4a9c-965a-1e6543166a2a") : secret "odh-model-controller-webhook-cert" not found Apr 20 13:37:33.675027 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:33.674984 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/090657bd-bd1c-4a9c-965a-1e6543166a2a-cert\") pod \"odh-model-controller-858dbf95b8-scs84\" (UID: \"090657bd-bd1c-4a9c-965a-1e6543166a2a\") " pod="opendatahub/odh-model-controller-858dbf95b8-scs84" Apr 20 13:37:33.677743 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:33.677716 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/090657bd-bd1c-4a9c-965a-1e6543166a2a-cert\") pod \"odh-model-controller-858dbf95b8-scs84\" (UID: \"090657bd-bd1c-4a9c-965a-1e6543166a2a\") " pod="opendatahub/odh-model-controller-858dbf95b8-scs84" Apr 20 13:37:33.796904 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:33.796855 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-scs84" Apr 20 13:37:33.921295 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:33.921270 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-scs84"] Apr 20 13:37:33.923734 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:37:33.923704 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod090657bd_bd1c_4a9c_965a_1e6543166a2a.slice/crio-7ea1d4d14cffb44be344329bef3124400311b851de7e55e9424358a4fdd425ab WatchSource:0}: Error finding container 7ea1d4d14cffb44be344329bef3124400311b851de7e55e9424358a4fdd425ab: Status 404 returned error can't find the container with id 7ea1d4d14cffb44be344329bef3124400311b851de7e55e9424358a4fdd425ab Apr 20 13:37:34.358197 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:34.358158 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-scs84" event={"ID":"090657bd-bd1c-4a9c-965a-1e6543166a2a","Type":"ContainerStarted","Data":"7ea1d4d14cffb44be344329bef3124400311b851de7e55e9424358a4fdd425ab"} Apr 20 13:37:36.994451 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:36.994412 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-46dfm"] Apr 20 13:37:36.997026 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:36.997003 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-46dfm" Apr 20 13:37:36.999457 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:36.999429 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 20 13:37:36.999457 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:36.999448 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-s22ds\"" Apr 20 13:37:37.005695 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:37.005663 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-46dfm"] Apr 20 13:37:37.106074 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:37.106025 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12f63628-da5a-4c24-b955-45c6549f5e08-cert\") pod \"kserve-controller-manager-856948b99f-46dfm\" (UID: \"12f63628-da5a-4c24-b955-45c6549f5e08\") " pod="opendatahub/kserve-controller-manager-856948b99f-46dfm" Apr 20 13:37:37.106219 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:37.106165 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6lv5\" (UniqueName: \"kubernetes.io/projected/12f63628-da5a-4c24-b955-45c6549f5e08-kube-api-access-b6lv5\") pod \"kserve-controller-manager-856948b99f-46dfm\" (UID: \"12f63628-da5a-4c24-b955-45c6549f5e08\") " pod="opendatahub/kserve-controller-manager-856948b99f-46dfm" Apr 20 13:37:37.207579 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:37.207533 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6lv5\" (UniqueName: \"kubernetes.io/projected/12f63628-da5a-4c24-b955-45c6549f5e08-kube-api-access-b6lv5\") pod \"kserve-controller-manager-856948b99f-46dfm\" (UID: \"12f63628-da5a-4c24-b955-45c6549f5e08\") " 
pod="opendatahub/kserve-controller-manager-856948b99f-46dfm" Apr 20 13:37:37.207797 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:37.207612 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12f63628-da5a-4c24-b955-45c6549f5e08-cert\") pod \"kserve-controller-manager-856948b99f-46dfm\" (UID: \"12f63628-da5a-4c24-b955-45c6549f5e08\") " pod="opendatahub/kserve-controller-manager-856948b99f-46dfm" Apr 20 13:37:37.207797 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:37:37.207722 2563 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 20 13:37:37.207797 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:37:37.207776 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12f63628-da5a-4c24-b955-45c6549f5e08-cert podName:12f63628-da5a-4c24-b955-45c6549f5e08 nodeName:}" failed. No retries permitted until 2026-04-20 13:37:37.707759832 +0000 UTC m=+413.271451081 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/12f63628-da5a-4c24-b955-45c6549f5e08-cert") pod "kserve-controller-manager-856948b99f-46dfm" (UID: "12f63628-da5a-4c24-b955-45c6549f5e08") : secret "kserve-webhook-server-cert" not found Apr 20 13:37:37.217229 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:37.217189 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6lv5\" (UniqueName: \"kubernetes.io/projected/12f63628-da5a-4c24-b955-45c6549f5e08-kube-api-access-b6lv5\") pod \"kserve-controller-manager-856948b99f-46dfm\" (UID: \"12f63628-da5a-4c24-b955-45c6549f5e08\") " pod="opendatahub/kserve-controller-manager-856948b99f-46dfm" Apr 20 13:37:37.368193 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:37.368157 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-scs84" event={"ID":"090657bd-bd1c-4a9c-965a-1e6543166a2a","Type":"ContainerStarted","Data":"cad44dbc9f8b4d952786066603ef83b04165dff24542da4b7f2321644fd930b3"} Apr 20 13:37:37.368380 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:37.368363 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-scs84" Apr 20 13:37:37.406314 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:37.406263 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-scs84" podStartSLOduration=3.253745839 podStartE2EDuration="6.406247405s" podCreationTimestamp="2026-04-20 13:37:31 +0000 UTC" firstStartedPulling="2026-04-20 13:37:33.925107228 +0000 UTC m=+409.488798480" lastFinishedPulling="2026-04-20 13:37:37.077608796 +0000 UTC m=+412.641300046" observedRunningTime="2026-04-20 13:37:37.404536759 +0000 UTC m=+412.968228026" watchObservedRunningTime="2026-04-20 13:37:37.406247405 +0000 UTC m=+412.969938675" Apr 20 13:37:37.712408 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:37.712312 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12f63628-da5a-4c24-b955-45c6549f5e08-cert\") pod \"kserve-controller-manager-856948b99f-46dfm\" (UID: \"12f63628-da5a-4c24-b955-45c6549f5e08\") " pod="opendatahub/kserve-controller-manager-856948b99f-46dfm" Apr 20 13:37:37.715027 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:37.714981 2563 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12f63628-da5a-4c24-b955-45c6549f5e08-cert\") pod \"kserve-controller-manager-856948b99f-46dfm\" (UID: \"12f63628-da5a-4c24-b955-45c6549f5e08\") " pod="opendatahub/kserve-controller-manager-856948b99f-46dfm" Apr 20 13:37:37.912317 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:37.912276 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-46dfm" Apr 20 13:37:38.041738 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:38.041710 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-46dfm"] Apr 20 13:37:38.043772 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:37:38.043743 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12f63628_da5a_4c24_b955_45c6549f5e08.slice/crio-c4e1e89c31d360780668672c81cf851f7460d3c0cd17dc7253ed9a4782a5e164 WatchSource:0}: Error finding container c4e1e89c31d360780668672c81cf851f7460d3c0cd17dc7253ed9a4782a5e164: Status 404 returned error can't find the container with id c4e1e89c31d360780668672c81cf851f7460d3c0cd17dc7253ed9a4782a5e164 Apr 20 13:37:38.373678 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:38.373640 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-46dfm" event={"ID":"12f63628-da5a-4c24-b955-45c6549f5e08","Type":"ContainerStarted","Data":"c4e1e89c31d360780668672c81cf851f7460d3c0cd17dc7253ed9a4782a5e164"} Apr 20 13:37:39.491809 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:39.491773 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-54c65669-xrgmc"] Apr 20 13:37:39.494004 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:39.493973 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-54c65669-xrgmc" Apr 20 13:37:39.497700 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:39.497678 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 13:37:39.498322 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:39.498302 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 13:37:39.498803 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:39.498786 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 13:37:39.498986 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:39.498967 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 20 13:37:39.510892 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:39.510864 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-54c65669-xrgmc"] Apr 20 13:37:39.529543 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:39.529516 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c21b573-c73e-4d3c-bac7-da1338a4aa40-tls-certs\") pod \"kube-auth-proxy-54c65669-xrgmc\" (UID: \"3c21b573-c73e-4d3c-bac7-da1338a4aa40\") " pod="openshift-ingress/kube-auth-proxy-54c65669-xrgmc" Apr 20 13:37:39.529698 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:39.529547 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrqdw\" (UniqueName: \"kubernetes.io/projected/3c21b573-c73e-4d3c-bac7-da1338a4aa40-kube-api-access-mrqdw\") pod \"kube-auth-proxy-54c65669-xrgmc\" (UID: \"3c21b573-c73e-4d3c-bac7-da1338a4aa40\") " pod="openshift-ingress/kube-auth-proxy-54c65669-xrgmc" Apr 20 13:37:39.529698 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:39.529586 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3c21b573-c73e-4d3c-bac7-da1338a4aa40-tmp\") pod \"kube-auth-proxy-54c65669-xrgmc\" (UID: \"3c21b573-c73e-4d3c-bac7-da1338a4aa40\") " pod="openshift-ingress/kube-auth-proxy-54c65669-xrgmc" Apr 20 13:37:39.630911 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:39.630872 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c21b573-c73e-4d3c-bac7-da1338a4aa40-tls-certs\") pod \"kube-auth-proxy-54c65669-xrgmc\" (UID: \"3c21b573-c73e-4d3c-bac7-da1338a4aa40\") " pod="openshift-ingress/kube-auth-proxy-54c65669-xrgmc" Apr 20 13:37:39.630911 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:39.630911 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrqdw\" (UniqueName: \"kubernetes.io/projected/3c21b573-c73e-4d3c-bac7-da1338a4aa40-kube-api-access-mrqdw\") pod \"kube-auth-proxy-54c65669-xrgmc\" (UID: \"3c21b573-c73e-4d3c-bac7-da1338a4aa40\") " pod="openshift-ingress/kube-auth-proxy-54c65669-xrgmc" Apr 20 13:37:39.631174 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:39.630952 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3c21b573-c73e-4d3c-bac7-da1338a4aa40-tmp\") pod \"kube-auth-proxy-54c65669-xrgmc\" (UID: 
\"3c21b573-c73e-4d3c-bac7-da1338a4aa40\") " pod="openshift-ingress/kube-auth-proxy-54c65669-xrgmc" Apr 20 13:37:39.633436 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:39.633408 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3c21b573-c73e-4d3c-bac7-da1338a4aa40-tmp\") pod \"kube-auth-proxy-54c65669-xrgmc\" (UID: \"3c21b573-c73e-4d3c-bac7-da1338a4aa40\") " pod="openshift-ingress/kube-auth-proxy-54c65669-xrgmc" Apr 20 13:37:39.633619 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:39.633603 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c21b573-c73e-4d3c-bac7-da1338a4aa40-tls-certs\") pod \"kube-auth-proxy-54c65669-xrgmc\" (UID: \"3c21b573-c73e-4d3c-bac7-da1338a4aa40\") " pod="openshift-ingress/kube-auth-proxy-54c65669-xrgmc" Apr 20 13:37:39.642108 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:39.642081 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrqdw\" (UniqueName: \"kubernetes.io/projected/3c21b573-c73e-4d3c-bac7-da1338a4aa40-kube-api-access-mrqdw\") pod \"kube-auth-proxy-54c65669-xrgmc\" (UID: \"3c21b573-c73e-4d3c-bac7-da1338a4aa40\") " pod="openshift-ingress/kube-auth-proxy-54c65669-xrgmc" Apr 20 13:37:39.805110 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:39.805071 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-54c65669-xrgmc" Apr 20 13:37:39.949026 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:39.948978 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-54c65669-xrgmc"] Apr 20 13:37:40.387028 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:37:40.386995 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c21b573_c73e_4d3c_bac7_da1338a4aa40.slice/crio-a49b10d5dbdd68ec0620769b4de3289476c167d06e63aaf91f9bebae5bd376d4 WatchSource:0}: Error finding container a49b10d5dbdd68ec0620769b4de3289476c167d06e63aaf91f9bebae5bd376d4: Status 404 returned error can't find the container with id a49b10d5dbdd68ec0620769b4de3289476c167d06e63aaf91f9bebae5bd376d4 Apr 20 13:37:41.385199 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:41.385160 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-54c65669-xrgmc" event={"ID":"3c21b573-c73e-4d3c-bac7-da1338a4aa40","Type":"ContainerStarted","Data":"a49b10d5dbdd68ec0620769b4de3289476c167d06e63aaf91f9bebae5bd376d4"} Apr 20 13:37:41.388327 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:41.387078 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-46dfm" event={"ID":"12f63628-da5a-4c24-b955-45c6549f5e08","Type":"ContainerStarted","Data":"9dd9f2c592b2a4993e7ab31d2f09d6a03fd57540364107f1ebaec0a26845bfde"} Apr 20 13:37:41.388327 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:41.387832 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-46dfm" Apr 20 13:37:41.404382 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:41.404317 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-46dfm" podStartSLOduration=3.032960714 podStartE2EDuration="5.404297972s" podCreationTimestamp="2026-04-20 13:37:36 +0000 UTC" 
firstStartedPulling="2026-04-20 13:37:38.045107574 +0000 UTC m=+413.608798825" lastFinishedPulling="2026-04-20 13:37:40.416444827 +0000 UTC m=+415.980136083" observedRunningTime="2026-04-20 13:37:41.403316594 +0000 UTC m=+416.967007868" watchObservedRunningTime="2026-04-20 13:37:41.404297972 +0000 UTC m=+416.967989243" Apr 20 13:37:43.394767 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:43.394732 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-54c65669-xrgmc" event={"ID":"3c21b573-c73e-4d3c-bac7-da1338a4aa40","Type":"ContainerStarted","Data":"0b7c253f760afe8120930a990c12920d2ca5c3e1713ace5bc2a51eed27281b41"} Apr 20 13:37:43.412604 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:43.412499 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-54c65669-xrgmc" podStartSLOduration=1.641633795 podStartE2EDuration="4.412484626s" podCreationTimestamp="2026-04-20 13:37:39 +0000 UTC" firstStartedPulling="2026-04-20 13:37:40.388743733 +0000 UTC m=+415.952434984" lastFinishedPulling="2026-04-20 13:37:43.159594556 +0000 UTC m=+418.723285815" observedRunningTime="2026-04-20 13:37:43.410669167 +0000 UTC m=+418.974360437" watchObservedRunningTime="2026-04-20 13:37:43.412484626 +0000 UTC m=+418.976175954" Apr 20 13:37:48.376215 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:48.376180 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-scs84" Apr 20 13:37:54.824612 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:54.824576 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-9q822"] Apr 20 13:37:54.828443 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:54.828417 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9q822" Apr 20 13:37:54.832042 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:54.832014 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 20 13:37:54.832042 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:54.832021 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 20 13:37:54.832252 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:54.832140 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-q5qvk\"" Apr 20 13:37:54.838745 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:54.838720 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-9q822"] Apr 20 13:37:54.960433 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:54.960392 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4944p\" (UniqueName: \"kubernetes.io/projected/a8fcd189-2df3-4722-9684-a1a03d63bbc9-kube-api-access-4944p\") pod \"servicemesh-operator3-55f49c5f94-9q822\" (UID: \"a8fcd189-2df3-4722-9684-a1a03d63bbc9\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9q822" Apr 20 13:37:54.960433 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:54.960433 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a8fcd189-2df3-4722-9684-a1a03d63bbc9-operator-config\") pod \"servicemesh-operator3-55f49c5f94-9q822\" (UID: \"a8fcd189-2df3-4722-9684-a1a03d63bbc9\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9q822" Apr 20 13:37:55.061686 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:55.061648 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4944p\" (UniqueName: \"kubernetes.io/projected/a8fcd189-2df3-4722-9684-a1a03d63bbc9-kube-api-access-4944p\") pod \"servicemesh-operator3-55f49c5f94-9q822\" (UID: \"a8fcd189-2df3-4722-9684-a1a03d63bbc9\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9q822" Apr 20 13:37:55.061686 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:55.061687 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a8fcd189-2df3-4722-9684-a1a03d63bbc9-operator-config\") pod \"servicemesh-operator3-55f49c5f94-9q822\" (UID: \"a8fcd189-2df3-4722-9684-a1a03d63bbc9\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9q822" Apr 20 13:37:55.064875 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:55.064848 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a8fcd189-2df3-4722-9684-a1a03d63bbc9-operator-config\") pod \"servicemesh-operator3-55f49c5f94-9q822\" (UID: \"a8fcd189-2df3-4722-9684-a1a03d63bbc9\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9q822" Apr 20 13:37:55.073334 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:55.073308 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4944p\" (UniqueName: \"kubernetes.io/projected/a8fcd189-2df3-4722-9684-a1a03d63bbc9-kube-api-access-4944p\") pod \"servicemesh-operator3-55f49c5f94-9q822\" (UID: 
\"a8fcd189-2df3-4722-9684-a1a03d63bbc9\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9q822" Apr 20 13:37:55.137867 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:55.137785 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9q822" Apr 20 13:37:55.280431 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:55.280287 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-9q822"] Apr 20 13:37:55.283340 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:37:55.283312 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8fcd189_2df3_4722_9684_a1a03d63bbc9.slice/crio-a34eccfa85cca86febb4256e786eaf6a09bd15cebea0fbf7faa16728939eb66b WatchSource:0}: Error finding container a34eccfa85cca86febb4256e786eaf6a09bd15cebea0fbf7faa16728939eb66b: Status 404 returned error can't find the container with id a34eccfa85cca86febb4256e786eaf6a09bd15cebea0fbf7faa16728939eb66b Apr 20 13:37:55.435038 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:55.434946 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9q822" event={"ID":"a8fcd189-2df3-4722-9684-a1a03d63bbc9","Type":"ContainerStarted","Data":"a34eccfa85cca86febb4256e786eaf6a09bd15cebea0fbf7faa16728939eb66b"} Apr 20 13:37:58.446449 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:58.446413 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9q822" event={"ID":"a8fcd189-2df3-4722-9684-a1a03d63bbc9","Type":"ContainerStarted","Data":"2f11d4f10eb04e2778a4bb17d561507a351864ab634e6a6f9f31188cd5c710dd"} Apr 20 13:37:58.446841 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:58.446626 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9q822" Apr 20 13:37:58.467291 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:37:58.467237 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9q822" podStartSLOduration=1.788655994 podStartE2EDuration="4.467222855s" podCreationTimestamp="2026-04-20 13:37:54 +0000 UTC" firstStartedPulling="2026-04-20 13:37:55.288324653 +0000 UTC m=+430.852015916" lastFinishedPulling="2026-04-20 13:37:57.966891529 +0000 UTC m=+433.530582777" observedRunningTime="2026-04-20 13:37:58.464983217 +0000 UTC m=+434.028674486" watchObservedRunningTime="2026-04-20 13:37:58.467222855 +0000 UTC m=+434.030914125" Apr 20 13:38:09.451773 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:09.451743 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9q822" Apr 20 13:38:10.998366 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:10.998328 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6"] Apr 20 13:38:11.054040 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.054006 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6"] Apr 20 13:38:11.054219 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.054163 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.056637 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.056607 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-4vhqb\"" Apr 20 13:38:11.056637 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.056627 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 13:38:11.056637 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.056605 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 20 13:38:11.056858 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.056700 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 20 13:38:11.057333 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.057316 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 20 13:38:11.197143 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.197105 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/02780b87-93fa-4e8b-8727-f2b1580ff6ee-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-t28m6\" (UID: \"02780b87-93fa-4e8b-8727-f2b1580ff6ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.197143 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.197145 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/02780b87-93fa-4e8b-8727-f2b1580ff6ee-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-t28m6\" (UID: \"02780b87-93fa-4e8b-8727-f2b1580ff6ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.197354 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.197168 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/02780b87-93fa-4e8b-8727-f2b1580ff6ee-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-t28m6\" (UID: \"02780b87-93fa-4e8b-8727-f2b1580ff6ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.197354 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.197258 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mt2g\" (UniqueName: \"kubernetes.io/projected/02780b87-93fa-4e8b-8727-f2b1580ff6ee-kube-api-access-2mt2g\") pod \"istiod-openshift-gateway-55ff986f96-t28m6\" (UID: \"02780b87-93fa-4e8b-8727-f2b1580ff6ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.197354 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.197303 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/02780b87-93fa-4e8b-8727-f2b1580ff6ee-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-t28m6\" (UID: \"02780b87-93fa-4e8b-8727-f2b1580ff6ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.197449 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.197365 2563 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/02780b87-93fa-4e8b-8727-f2b1580ff6ee-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-t28m6\" (UID: \"02780b87-93fa-4e8b-8727-f2b1580ff6ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.197449 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.197398 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/02780b87-93fa-4e8b-8727-f2b1580ff6ee-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-t28m6\" (UID: \"02780b87-93fa-4e8b-8727-f2b1580ff6ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.298557 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.298519 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/02780b87-93fa-4e8b-8727-f2b1580ff6ee-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-t28m6\" (UID: \"02780b87-93fa-4e8b-8727-f2b1580ff6ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.298557 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.298566 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/02780b87-93fa-4e8b-8727-f2b1580ff6ee-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-t28m6\" (UID: \"02780b87-93fa-4e8b-8727-f2b1580ff6ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.298809 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.298584 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/02780b87-93fa-4e8b-8727-f2b1580ff6ee-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-t28m6\" (UID: \"02780b87-93fa-4e8b-8727-f2b1580ff6ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.298809 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.298622 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/02780b87-93fa-4e8b-8727-f2b1580ff6ee-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-t28m6\" (UID: \"02780b87-93fa-4e8b-8727-f2b1580ff6ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.298809 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.298640 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/02780b87-93fa-4e8b-8727-f2b1580ff6ee-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-t28m6\" (UID: \"02780b87-93fa-4e8b-8727-f2b1580ff6ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.298809 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.298656 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/02780b87-93fa-4e8b-8727-f2b1580ff6ee-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-t28m6\" (UID: \"02780b87-93fa-4e8b-8727-f2b1580ff6ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.298809 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.298707 2563 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mt2g\" (UniqueName: \"kubernetes.io/projected/02780b87-93fa-4e8b-8727-f2b1580ff6ee-kube-api-access-2mt2g\") pod \"istiod-openshift-gateway-55ff986f96-t28m6\" (UID: \"02780b87-93fa-4e8b-8727-f2b1580ff6ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.299462 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.299431 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/02780b87-93fa-4e8b-8727-f2b1580ff6ee-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-t28m6\" (UID: \"02780b87-93fa-4e8b-8727-f2b1580ff6ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.300981 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.300957 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/02780b87-93fa-4e8b-8727-f2b1580ff6ee-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-t28m6\" (UID: \"02780b87-93fa-4e8b-8727-f2b1580ff6ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.301649 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.301625 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/02780b87-93fa-4e8b-8727-f2b1580ff6ee-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-t28m6\" (UID: \"02780b87-93fa-4e8b-8727-f2b1580ff6ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.301850 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.301834 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/02780b87-93fa-4e8b-8727-f2b1580ff6ee-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-t28m6\" (UID: \"02780b87-93fa-4e8b-8727-f2b1580ff6ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.301919 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.301887 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/02780b87-93fa-4e8b-8727-f2b1580ff6ee-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-t28m6\" (UID: \"02780b87-93fa-4e8b-8727-f2b1580ff6ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.306910 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.306886 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/02780b87-93fa-4e8b-8727-f2b1580ff6ee-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-t28m6\" (UID: \"02780b87-93fa-4e8b-8727-f2b1580ff6ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.307182 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.307163 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mt2g\" (UniqueName: \"kubernetes.io/projected/02780b87-93fa-4e8b-8727-f2b1580ff6ee-kube-api-access-2mt2g\") pod \"istiod-openshift-gateway-55ff986f96-t28m6\" (UID: \"02780b87-93fa-4e8b-8727-f2b1580ff6ee\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.364192 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.364153 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:11.500775 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:11.500742 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6"] Apr 20 13:38:11.503494 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:38:11.503467 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02780b87_93fa_4e8b_8727_f2b1580ff6ee.slice/crio-1d028dae16bdbe069efab7f6527e16497600c3a536297c15320fd3c61419768a WatchSource:0}: Error finding container 1d028dae16bdbe069efab7f6527e16497600c3a536297c15320fd3c61419768a: Status 404 returned error can't find the container with id 1d028dae16bdbe069efab7f6527e16497600c3a536297c15320fd3c61419768a Apr 20 13:38:12.496514 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:12.496474 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" event={"ID":"02780b87-93fa-4e8b-8727-f2b1580ff6ee","Type":"ContainerStarted","Data":"1d028dae16bdbe069efab7f6527e16497600c3a536297c15320fd3c61419768a"} Apr 20 13:38:13.400894 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:13.400859 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-46dfm" Apr 20 13:38:14.851881 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:14.851844 2563 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 13:38:14.852213 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:14.851913 2563 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 13:38:15.507995 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:15.507957 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" event={"ID":"02780b87-93fa-4e8b-8727-f2b1580ff6ee","Type":"ContainerStarted","Data":"990fe4e3a97d0fda8df4ab94ab20d764a8d2b7be4230a3e62c22b8a3574377dd"} Apr 20 13:38:15.508206 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:15.508132 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:15.509875 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:15.509839 2563 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-t28m6 container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 20 13:38:15.509994 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:15.509893 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" podUID="02780b87-93fa-4e8b-8727-f2b1580ff6ee" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 13:38:15.528857 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:15.528799 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" podStartSLOduration=2.182745446 podStartE2EDuration="5.528783919s" podCreationTimestamp="2026-04-20 13:38:10 +0000 UTC" 
firstStartedPulling="2026-04-20 13:38:11.505572732 +0000 UTC m=+447.069263997" lastFinishedPulling="2026-04-20 13:38:14.851611222 +0000 UTC m=+450.415302470" observedRunningTime="2026-04-20 13:38:15.527203036 +0000 UTC m=+451.090894306" watchObservedRunningTime="2026-04-20 13:38:15.528783919 +0000 UTC m=+451.092475190" Apr 20 13:38:16.511452 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:16.511421 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-t28m6" Apr 20 13:38:58.850285 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:58.850250 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-pmpw4"] Apr 20 13:38:58.853242 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:58.853219 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-pmpw4" Apr 20 13:38:58.855658 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:58.855631 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 13:38:58.856510 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:58.856488 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 13:38:58.856623 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:58.856553 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-v6pxn\"" Apr 20 13:38:58.864817 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:58.864788 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-pmpw4"] Apr 20 13:38:58.998280 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:58.998236 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5h6q\" (UniqueName: \"kubernetes.io/projected/f9d2b9e6-4f0b-4645-9b13-4f7e53c18c72-kube-api-access-h5h6q\") pod \"authorino-operator-657f44b778-pmpw4\" (UID: \"f9d2b9e6-4f0b-4645-9b13-4f7e53c18c72\") " pod="kuadrant-system/authorino-operator-657f44b778-pmpw4" Apr 20 13:38:59.099023 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:59.098985 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5h6q\" (UniqueName: \"kubernetes.io/projected/f9d2b9e6-4f0b-4645-9b13-4f7e53c18c72-kube-api-access-h5h6q\") pod \"authorino-operator-657f44b778-pmpw4\" (UID: \"f9d2b9e6-4f0b-4645-9b13-4f7e53c18c72\") " pod="kuadrant-system/authorino-operator-657f44b778-pmpw4" Apr 20 13:38:59.107430 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:59.107357 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5h6q\" (UniqueName: \"kubernetes.io/projected/f9d2b9e6-4f0b-4645-9b13-4f7e53c18c72-kube-api-access-h5h6q\") pod \"authorino-operator-657f44b778-pmpw4\" (UID: \"f9d2b9e6-4f0b-4645-9b13-4f7e53c18c72\") " pod="kuadrant-system/authorino-operator-657f44b778-pmpw4" Apr 20 13:38:59.164547 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:59.164510 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-pmpw4" Apr 20 13:38:59.307194 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:59.307169 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-pmpw4"] Apr 20 13:38:59.309844 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:38:59.309812 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9d2b9e6_4f0b_4645_9b13_4f7e53c18c72.slice/crio-e7c8f24cb6b5da733b1d33703912ae6355f4e46fdf4d6e459569d5f516113823 WatchSource:0}: Error finding container e7c8f24cb6b5da733b1d33703912ae6355f4e46fdf4d6e459569d5f516113823: Status 404 returned error can't find the container with id e7c8f24cb6b5da733b1d33703912ae6355f4e46fdf4d6e459569d5f516113823 Apr 20 13:38:59.650892 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:38:59.650854 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-pmpw4" event={"ID":"f9d2b9e6-4f0b-4645-9b13-4f7e53c18c72","Type":"ContainerStarted","Data":"e7c8f24cb6b5da733b1d33703912ae6355f4e46fdf4d6e459569d5f516113823"} Apr 20 13:39:01.659313 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:01.659265 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-pmpw4" event={"ID":"f9d2b9e6-4f0b-4645-9b13-4f7e53c18c72","Type":"ContainerStarted","Data":"5cbd64713be90efcc8fe68e11313beaa069062e082251bd3e06adadbdea6e7ee"} Apr 20 13:39:01.659711 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:01.659425 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-pmpw4" Apr 20 13:39:01.690141 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:01.690085 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-pmpw4" podStartSLOduration=2.1469244610000002 podStartE2EDuration="3.690071264s" podCreationTimestamp="2026-04-20 13:38:58 +0000 UTC" firstStartedPulling="2026-04-20 13:38:59.31175919 +0000 UTC m=+494.875450438" lastFinishedPulling="2026-04-20 13:39:00.854905985 +0000 UTC m=+496.418597241" observedRunningTime="2026-04-20 13:39:01.689828624 +0000 UTC m=+497.253519894" watchObservedRunningTime="2026-04-20 13:39:01.690071264 +0000 UTC m=+497.253762531" Apr 20 13:39:12.665087 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:12.665036 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-pmpw4" Apr 20 13:39:52.176987 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.176949 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv"] Apr 20 13:39:52.181637 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.181617 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.184250 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.184223 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-qxb8f\"" Apr 20 13:39:52.192323 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.192298 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv"] Apr 20 13:39:52.237408 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.237378 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.237590 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.237416 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.237590 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.237497 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.237590 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.237549 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.237760 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.237639 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.237760 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.237674 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.237760 ip-10-0-132-232 kubenswrapper[2563]: I0420 
13:39:52.237719 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.237760 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.237750 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrkjg\" (UniqueName: \"kubernetes.io/projected/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-kube-api-access-qrkjg\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.237925 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.237780 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.339211 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.339172 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrkjg\" (UniqueName: \"kubernetes.io/projected/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-kube-api-access-qrkjg\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.339418 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.339280 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.339418 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.339363 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.339418 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.339384 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.339589 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.339567 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-credential-socket\") pod 
\"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.339650 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.339613 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.339707 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.339686 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.339760 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.339716 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.339814 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.339758 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.340250 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.339980 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.340250 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.340129 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.340250 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.340206 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.340480 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.340337 2563 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.340480 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.340445 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.342364 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.342343 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.342690 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.342665 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.349467 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.349440 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.349888 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.349853 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrkjg\" (UniqueName: \"kubernetes.io/projected/774a70ec-4f0b-4ad0-b0fa-7bae4bf52465-kube-api-access-qrkjg\") pod \"maas-default-gateway-openshift-default-58b6f876-nwrxv\" (UID: \"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.497791 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.497701 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:52.638171 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.638137 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv"] Apr 20 13:39:52.641389 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:39:52.641357 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod774a70ec_4f0b_4ad0_b0fa_7bae4bf52465.slice/crio-6203e42f73e2beb8cac26134244e1e656e3f1c041f1c2616c8cb15463736f4ec WatchSource:0}: Error finding container 6203e42f73e2beb8cac26134244e1e656e3f1c041f1c2616c8cb15463736f4ec: Status 404 returned error can't find the container with id 6203e42f73e2beb8cac26134244e1e656e3f1c041f1c2616c8cb15463736f4ec Apr 20 13:39:52.831956 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:52.831920 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" event={"ID":"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465","Type":"ContainerStarted","Data":"6203e42f73e2beb8cac26134244e1e656e3f1c041f1c2616c8cb15463736f4ec"} Apr 20 13:39:55.042749 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:55.042699 2563 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 13:39:55.043072 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:55.042776 2563 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 13:39:55.043072 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:55.042801 2563 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 13:39:55.844945 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:55.844904 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" event={"ID":"774a70ec-4f0b-4ad0-b0fa-7bae4bf52465","Type":"ContainerStarted","Data":"b5f6e4ff278ff92d8dd27a773a98e7bf4951da715de0781b3b992ed46b212767"} Apr 20 13:39:55.867010 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:55.866955 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" podStartSLOduration=1.4677900529999999 podStartE2EDuration="3.866939144s" podCreationTimestamp="2026-04-20 13:39:52 +0000 UTC" firstStartedPulling="2026-04-20 13:39:52.643257138 +0000 UTC m=+548.206948390" lastFinishedPulling="2026-04-20 13:39:55.04240623 +0000 UTC m=+550.606097481" observedRunningTime="2026-04-20 13:39:55.865444514 +0000 UTC m=+551.429135797" watchObservedRunningTime="2026-04-20 13:39:55.866939144 +0000 UTC m=+551.430630414" Apr 20 13:39:56.497467 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:56.497388 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-mkwtx"] Apr 20 13:39:56.500800 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:56.500776 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:56.500917 ip-10-0-132-232 
kubenswrapper[2563]: I0420 13:39:56.500883 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-mkwtx" Apr 20 13:39:56.503496 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:56.503471 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-bbcr5\"" Apr 20 13:39:56.503626 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:56.503499 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 13:39:56.503989 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:56.503972 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:56.509417 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:56.509397 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-mkwtx"] Apr 20 13:39:56.580521 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:56.580489 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1142cf5c-a071-4c41-a248-015d6b34d58d-config-file\") pod \"limitador-limitador-7d549b5b-mkwtx\" (UID: \"1142cf5c-a071-4c41-a248-015d6b34d58d\") " pod="kuadrant-system/limitador-limitador-7d549b5b-mkwtx" Apr 20 13:39:56.580677 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:56.580529 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdt9b\" (UniqueName: \"kubernetes.io/projected/1142cf5c-a071-4c41-a248-015d6b34d58d-kube-api-access-qdt9b\") pod \"limitador-limitador-7d549b5b-mkwtx\" (UID: \"1142cf5c-a071-4c41-a248-015d6b34d58d\") " pod="kuadrant-system/limitador-limitador-7d549b5b-mkwtx" Apr 20 13:39:56.596678 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:56.596641 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-mkwtx"] Apr 20 13:39:56.681720 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:56.681679 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1142cf5c-a071-4c41-a248-015d6b34d58d-config-file\") pod \"limitador-limitador-7d549b5b-mkwtx\" (UID: \"1142cf5c-a071-4c41-a248-015d6b34d58d\") " pod="kuadrant-system/limitador-limitador-7d549b5b-mkwtx" Apr 20 13:39:56.681720 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:56.681722 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdt9b\" (UniqueName: \"kubernetes.io/projected/1142cf5c-a071-4c41-a248-015d6b34d58d-kube-api-access-qdt9b\") pod \"limitador-limitador-7d549b5b-mkwtx\" (UID: \"1142cf5c-a071-4c41-a248-015d6b34d58d\") " pod="kuadrant-system/limitador-limitador-7d549b5b-mkwtx" Apr 20 13:39:56.682438 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:56.682417 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1142cf5c-a071-4c41-a248-015d6b34d58d-config-file\") pod \"limitador-limitador-7d549b5b-mkwtx\" (UID: \"1142cf5c-a071-4c41-a248-015d6b34d58d\") " pod="kuadrant-system/limitador-limitador-7d549b5b-mkwtx" Apr 20 13:39:56.690933 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:56.690900 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-qdt9b\" (UniqueName: \"kubernetes.io/projected/1142cf5c-a071-4c41-a248-015d6b34d58d-kube-api-access-qdt9b\") pod \"limitador-limitador-7d549b5b-mkwtx\" (UID: \"1142cf5c-a071-4c41-a248-015d6b34d58d\") " pod="kuadrant-system/limitador-limitador-7d549b5b-mkwtx" Apr 20 13:39:56.811977 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:56.811935 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-mkwtx" Apr 20 13:39:56.848940 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:56.848891 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:56.850161 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:56.850138 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-nwrxv" Apr 20 13:39:56.959490 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:56.959457 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-mkwtx"] Apr 20 13:39:56.961572 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:39:56.961546 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1142cf5c_a071_4c41_a248_015d6b34d58d.slice/crio-00f69079fcbece174224ec246e193005da7c942f788894d19dbef98820c04a39 WatchSource:0}: Error finding container 00f69079fcbece174224ec246e193005da7c942f788894d19dbef98820c04a39: Status 404 returned error can't find the container with id 00f69079fcbece174224ec246e193005da7c942f788894d19dbef98820c04a39 Apr 20 13:39:57.855947 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:57.855898 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-mkwtx" event={"ID":"1142cf5c-a071-4c41-a248-015d6b34d58d","Type":"ContainerStarted","Data":"00f69079fcbece174224ec246e193005da7c942f788894d19dbef98820c04a39"} Apr 20 13:39:59.863665 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:59.863633 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-mkwtx" event={"ID":"1142cf5c-a071-4c41-a248-015d6b34d58d","Type":"ContainerStarted","Data":"85e5860719af3f00035d01e8720d40460483cd89ec3f680c564f40ac8cf656ed"} Apr 20 13:39:59.864096 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:59.863732 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-mkwtx" Apr 20 13:39:59.881140 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:39:59.881081 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-mkwtx" podStartSLOduration=1.155132554 podStartE2EDuration="3.881042586s" podCreationTimestamp="2026-04-20 13:39:56 +0000 UTC" firstStartedPulling="2026-04-20 13:39:56.963969437 +0000 UTC m=+552.527660689" lastFinishedPulling="2026-04-20 13:39:59.689879473 +0000 UTC m=+555.253570721" observedRunningTime="2026-04-20 13:39:59.880301853 +0000 UTC m=+555.443993124" watchObservedRunningTime="2026-04-20 13:39:59.881042586 +0000 UTC m=+555.444733855" Apr 20 13:40:10.868805 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:10.868768 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-mkwtx" Apr 20 13:40:12.057633 ip-10-0-132-232 
kubenswrapper[2563]: I0420 13:40:12.057592 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-mkwtx"] Apr 20 13:40:12.058093 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:12.057815 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-mkwtx" podUID="1142cf5c-a071-4c41-a248-015d6b34d58d" containerName="limitador" containerID="cri-o://85e5860719af3f00035d01e8720d40460483cd89ec3f680c564f40ac8cf656ed" gracePeriod=30 Apr 20 13:40:12.597232 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:12.597210 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-mkwtx" Apr 20 13:40:12.733581 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:12.733497 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1142cf5c-a071-4c41-a248-015d6b34d58d-config-file\") pod \"1142cf5c-a071-4c41-a248-015d6b34d58d\" (UID: \"1142cf5c-a071-4c41-a248-015d6b34d58d\") " Apr 20 13:40:12.733581 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:12.733534 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdt9b\" (UniqueName: \"kubernetes.io/projected/1142cf5c-a071-4c41-a248-015d6b34d58d-kube-api-access-qdt9b\") pod \"1142cf5c-a071-4c41-a248-015d6b34d58d\" (UID: \"1142cf5c-a071-4c41-a248-015d6b34d58d\") " Apr 20 13:40:12.733861 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:12.733838 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1142cf5c-a071-4c41-a248-015d6b34d58d-config-file" (OuterVolumeSpecName: "config-file") pod "1142cf5c-a071-4c41-a248-015d6b34d58d" (UID: "1142cf5c-a071-4c41-a248-015d6b34d58d"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 13:40:12.735892 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:12.735861 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1142cf5c-a071-4c41-a248-015d6b34d58d-kube-api-access-qdt9b" (OuterVolumeSpecName: "kube-api-access-qdt9b") pod "1142cf5c-a071-4c41-a248-015d6b34d58d" (UID: "1142cf5c-a071-4c41-a248-015d6b34d58d"). InnerVolumeSpecName "kube-api-access-qdt9b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:40:12.834958 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:12.834924 2563 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1142cf5c-a071-4c41-a248-015d6b34d58d-config-file\") on node \"ip-10-0-132-232.ec2.internal\" DevicePath \"\"" Apr 20 13:40:12.834958 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:12.834952 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qdt9b\" (UniqueName: \"kubernetes.io/projected/1142cf5c-a071-4c41-a248-015d6b34d58d-kube-api-access-qdt9b\") on node \"ip-10-0-132-232.ec2.internal\" DevicePath \"\"" Apr 20 13:40:12.905977 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:12.905939 2563 generic.go:358] "Generic (PLEG): container finished" podID="1142cf5c-a071-4c41-a248-015d6b34d58d" containerID="85e5860719af3f00035d01e8720d40460483cd89ec3f680c564f40ac8cf656ed" exitCode=0 Apr 20 13:40:12.906140 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:12.906004 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-mkwtx" Apr 20 13:40:12.906140 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:12.906013 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-mkwtx" event={"ID":"1142cf5c-a071-4c41-a248-015d6b34d58d","Type":"ContainerDied","Data":"85e5860719af3f00035d01e8720d40460483cd89ec3f680c564f40ac8cf656ed"} Apr 20 13:40:12.906140 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:12.906071 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-mkwtx" event={"ID":"1142cf5c-a071-4c41-a248-015d6b34d58d","Type":"ContainerDied","Data":"00f69079fcbece174224ec246e193005da7c942f788894d19dbef98820c04a39"} Apr 20 13:40:12.906140 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:12.906090 2563 scope.go:117] "RemoveContainer" containerID="85e5860719af3f00035d01e8720d40460483cd89ec3f680c564f40ac8cf656ed" Apr 20 13:40:12.914600 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:12.914580 2563 scope.go:117] "RemoveContainer" containerID="85e5860719af3f00035d01e8720d40460483cd89ec3f680c564f40ac8cf656ed" Apr 20 13:40:12.914855 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:40:12.914834 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85e5860719af3f00035d01e8720d40460483cd89ec3f680c564f40ac8cf656ed\": container with ID starting with 85e5860719af3f00035d01e8720d40460483cd89ec3f680c564f40ac8cf656ed not found: ID does not exist" containerID="85e5860719af3f00035d01e8720d40460483cd89ec3f680c564f40ac8cf656ed" Apr 20 13:40:12.914905 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:12.914865 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85e5860719af3f00035d01e8720d40460483cd89ec3f680c564f40ac8cf656ed"} err="failed to get container status \"85e5860719af3f00035d01e8720d40460483cd89ec3f680c564f40ac8cf656ed\": rpc error: code = NotFound desc = could not find container \"85e5860719af3f00035d01e8720d40460483cd89ec3f680c564f40ac8cf656ed\": container with ID starting with 85e5860719af3f00035d01e8720d40460483cd89ec3f680c564f40ac8cf656ed not found: ID does not exist" Apr 20 13:40:12.929125 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:12.929087 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-mkwtx"] Apr 20 13:40:12.936657 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:12.936627 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-mkwtx"] Apr 20 13:40:13.066804 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:13.066770 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1142cf5c-a071-4c41-a248-015d6b34d58d" path="/var/lib/kubelet/pods/1142cf5c-a071-4c41-a248-015d6b34d58d/volumes" Apr 20 13:40:17.672091 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:17.672037 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-kxs52"] Apr 20 13:40:17.672470 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:17.672405 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1142cf5c-a071-4c41-a248-015d6b34d58d" containerName="limitador" Apr 20 13:40:17.672470 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:17.672418 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="1142cf5c-a071-4c41-a248-015d6b34d58d" containerName="limitador" Apr 20 
13:40:17.672539 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:17.672504 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="1142cf5c-a071-4c41-a248-015d6b34d58d" containerName="limitador" Apr 20 13:40:17.676841 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:17.676819 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-kxs52" Apr 20 13:40:17.679315 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:17.679291 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 20 13:40:17.679464 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:17.679342 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-qfwkl\"" Apr 20 13:40:17.685159 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:17.684932 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-kxs52"] Apr 20 13:40:17.775968 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:17.775932 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k75z\" (UniqueName: \"kubernetes.io/projected/f5e08680-837e-42b3-906f-4546cdc8ff8f-kube-api-access-9k75z\") pod \"postgres-868db5846d-kxs52\" (UID: \"f5e08680-837e-42b3-906f-4546cdc8ff8f\") " pod="opendatahub/postgres-868db5846d-kxs52" Apr 20 13:40:17.776165 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:17.775978 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f5e08680-837e-42b3-906f-4546cdc8ff8f-data\") pod \"postgres-868db5846d-kxs52\" (UID: \"f5e08680-837e-42b3-906f-4546cdc8ff8f\") " pod="opendatahub/postgres-868db5846d-kxs52" Apr 20 13:40:17.876781 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:17.876729 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9k75z\" (UniqueName: \"kubernetes.io/projected/f5e08680-837e-42b3-906f-4546cdc8ff8f-kube-api-access-9k75z\") pod \"postgres-868db5846d-kxs52\" (UID: \"f5e08680-837e-42b3-906f-4546cdc8ff8f\") " pod="opendatahub/postgres-868db5846d-kxs52" Apr 20 13:40:17.876781 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:17.876783 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f5e08680-837e-42b3-906f-4546cdc8ff8f-data\") pod \"postgres-868db5846d-kxs52\" (UID: \"f5e08680-837e-42b3-906f-4546cdc8ff8f\") " pod="opendatahub/postgres-868db5846d-kxs52" Apr 20 13:40:17.877208 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:17.877191 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f5e08680-837e-42b3-906f-4546cdc8ff8f-data\") pod \"postgres-868db5846d-kxs52\" (UID: \"f5e08680-837e-42b3-906f-4546cdc8ff8f\") " pod="opendatahub/postgres-868db5846d-kxs52" Apr 20 13:40:17.885772 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:17.885739 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k75z\" (UniqueName: \"kubernetes.io/projected/f5e08680-837e-42b3-906f-4546cdc8ff8f-kube-api-access-9k75z\") pod \"postgres-868db5846d-kxs52\" (UID: \"f5e08680-837e-42b3-906f-4546cdc8ff8f\") " pod="opendatahub/postgres-868db5846d-kxs52" Apr 20 13:40:17.989180 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:17.989085 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-kxs52" Apr 20 13:40:18.116979 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:18.116804 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-kxs52"] Apr 20 13:40:18.119757 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:40:18.119724 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5e08680_837e_42b3_906f_4546cdc8ff8f.slice/crio-2807de7afd9a212e9b110083595cf729ea87b2736c58184508fcd257ea364925 WatchSource:0}: Error finding container 2807de7afd9a212e9b110083595cf729ea87b2736c58184508fcd257ea364925: Status 404 returned error can't find the container with id 2807de7afd9a212e9b110083595cf729ea87b2736c58184508fcd257ea364925 Apr 20 13:40:18.926942 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:18.926907 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-kxs52" event={"ID":"f5e08680-837e-42b3-906f-4546cdc8ff8f","Type":"ContainerStarted","Data":"2807de7afd9a212e9b110083595cf729ea87b2736c58184508fcd257ea364925"} Apr 20 13:40:23.946160 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:23.946117 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-kxs52" event={"ID":"f5e08680-837e-42b3-906f-4546cdc8ff8f","Type":"ContainerStarted","Data":"668f8d1d3bfd9f9897f6828f1653d37cfb9c610f9da3853f188fd2ca732a3a6c"} Apr 20 13:40:23.946735 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:23.946258 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-kxs52" Apr 20 13:40:23.970124 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:23.970075 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-kxs52" podStartSLOduration=1.730840977 podStartE2EDuration="6.970032326s" podCreationTimestamp="2026-04-20 13:40:17 +0000 UTC" firstStartedPulling="2026-04-20 13:40:18.121170883 +0000 UTC m=+573.684862135" lastFinishedPulling="2026-04-20 13:40:23.360362225 +0000 UTC m=+578.924053484" observedRunningTime="2026-04-20 13:40:23.965066051 +0000 UTC m=+579.528757321" watchObservedRunningTime="2026-04-20 13:40:23.970032326 +0000 UTC m=+579.533723596" Apr 20 13:40:29.978087 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:29.978029 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-kxs52" Apr 20 13:40:33.271488 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:33.271451 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-5df788645d-ctftr"] Apr 20 13:40:33.283258 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:33.283232 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5df788645d-ctftr"] Apr 20 13:40:33.283427 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:33.283350 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5df788645d-ctftr" Apr 20 13:40:33.285946 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:33.285922 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-d7t72\"" Apr 20 13:40:33.420530 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:33.420497 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6jjh\" (UniqueName: \"kubernetes.io/projected/d2aaef74-ae47-43f1-8c74-894d405e436c-kube-api-access-z6jjh\") pod \"maas-controller-5df788645d-ctftr\" (UID: \"d2aaef74-ae47-43f1-8c74-894d405e436c\") " pod="opendatahub/maas-controller-5df788645d-ctftr" Apr 20 13:40:33.521704 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:33.521600 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6jjh\" (UniqueName: \"kubernetes.io/projected/d2aaef74-ae47-43f1-8c74-894d405e436c-kube-api-access-z6jjh\") pod \"maas-controller-5df788645d-ctftr\" (UID: \"d2aaef74-ae47-43f1-8c74-894d405e436c\") " pod="opendatahub/maas-controller-5df788645d-ctftr" Apr 20 13:40:33.531378 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:33.531354 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6jjh\" (UniqueName: \"kubernetes.io/projected/d2aaef74-ae47-43f1-8c74-894d405e436c-kube-api-access-z6jjh\") pod \"maas-controller-5df788645d-ctftr\" (UID: \"d2aaef74-ae47-43f1-8c74-894d405e436c\") " pod="opendatahub/maas-controller-5df788645d-ctftr" Apr 20 13:40:33.599910 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:33.599862 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5df788645d-ctftr" Apr 20 13:40:33.726537 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:33.726512 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5df788645d-ctftr"] Apr 20 13:40:33.729110 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:40:33.729081 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2aaef74_ae47_43f1_8c74_894d405e436c.slice/crio-513e33bcc1ea884104b7179de29f7589ce66b3f9ac4b2754685e77319a7fcc87 WatchSource:0}: Error finding container 513e33bcc1ea884104b7179de29f7589ce66b3f9ac4b2754685e77319a7fcc87: Status 404 returned error can't find the container with id 513e33bcc1ea884104b7179de29f7589ce66b3f9ac4b2754685e77319a7fcc87 Apr 20 13:40:33.979208 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:33.979165 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5df788645d-ctftr" event={"ID":"d2aaef74-ae47-43f1-8c74-894d405e436c","Type":"ContainerStarted","Data":"513e33bcc1ea884104b7179de29f7589ce66b3f9ac4b2754685e77319a7fcc87"} Apr 20 13:40:36.989788 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:36.989750 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5df788645d-ctftr" event={"ID":"d2aaef74-ae47-43f1-8c74-894d405e436c","Type":"ContainerStarted","Data":"33c8e8909344e05e3e5f8ac880cd0613489211ba753b70e55705ddf448006859"} Apr 20 13:40:36.990239 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:36.989867 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-5df788645d-ctftr" Apr 20 13:40:37.007616 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:37.007560 2563 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-5df788645d-ctftr" podStartSLOduration=1.7606857470000001 podStartE2EDuration="4.007542421s" podCreationTimestamp="2026-04-20 13:40:33 +0000 UTC" firstStartedPulling="2026-04-20 13:40:33.730441974 +0000 UTC m=+589.294133221" lastFinishedPulling="2026-04-20 13:40:35.977298644 +0000 UTC m=+591.540989895" observedRunningTime="2026-04-20 13:40:37.006123527 +0000 UTC m=+592.569814797" watchObservedRunningTime="2026-04-20 13:40:37.007542421 +0000 UTC m=+592.571233690" Apr 20 13:40:38.154107 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:38.154069 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-667696b866-qpj9f"] Apr 20 13:40:38.157575 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:38.157556 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-667696b866-qpj9f" Apr 20 13:40:38.160812 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:38.160789 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 20 13:40:38.160947 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:38.160795 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 20 13:40:38.161170 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:38.161154 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-hn6q8\"" Apr 20 13:40:38.172391 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:38.172367 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-667696b866-qpj9f"] Apr 20 13:40:38.269285 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:38.269252 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbz9z\" (UniqueName: \"kubernetes.io/projected/2f414fbc-fa7c-4383-9b73-84ed5b26ea30-kube-api-access-nbz9z\") pod \"maas-api-667696b866-qpj9f\" (UID: \"2f414fbc-fa7c-4383-9b73-84ed5b26ea30\") " pod="opendatahub/maas-api-667696b866-qpj9f" Apr 20 13:40:38.269465 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:38.269294 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/2f414fbc-fa7c-4383-9b73-84ed5b26ea30-maas-api-tls\") pod \"maas-api-667696b866-qpj9f\" (UID: \"2f414fbc-fa7c-4383-9b73-84ed5b26ea30\") " pod="opendatahub/maas-api-667696b866-qpj9f" Apr 20 13:40:38.370611 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:38.370570 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbz9z\" (UniqueName: \"kubernetes.io/projected/2f414fbc-fa7c-4383-9b73-84ed5b26ea30-kube-api-access-nbz9z\") pod \"maas-api-667696b866-qpj9f\" (UID: \"2f414fbc-fa7c-4383-9b73-84ed5b26ea30\") " pod="opendatahub/maas-api-667696b866-qpj9f" Apr 20 13:40:38.370611 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:38.370616 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/2f414fbc-fa7c-4383-9b73-84ed5b26ea30-maas-api-tls\") pod \"maas-api-667696b866-qpj9f\" (UID: \"2f414fbc-fa7c-4383-9b73-84ed5b26ea30\") " pod="opendatahub/maas-api-667696b866-qpj9f" Apr 20 13:40:38.373307 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:38.373276 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/2f414fbc-fa7c-4383-9b73-84ed5b26ea30-maas-api-tls\") pod \"maas-api-667696b866-qpj9f\" (UID: \"2f414fbc-fa7c-4383-9b73-84ed5b26ea30\") " pod="opendatahub/maas-api-667696b866-qpj9f" Apr 20 13:40:38.380735 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:38.380695 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbz9z\" (UniqueName: \"kubernetes.io/projected/2f414fbc-fa7c-4383-9b73-84ed5b26ea30-kube-api-access-nbz9z\") pod \"maas-api-667696b866-qpj9f\" (UID: \"2f414fbc-fa7c-4383-9b73-84ed5b26ea30\") " pod="opendatahub/maas-api-667696b866-qpj9f" Apr 20 13:40:38.468543 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:38.468450 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-667696b866-qpj9f" Apr 20 13:40:38.600664 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:38.600632 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-667696b866-qpj9f"] Apr 20 13:40:38.603391 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:40:38.603362 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f414fbc_fa7c_4383_9b73_84ed5b26ea30.slice/crio-5845ef6d5b82404c5a62ab1e6e265dd4e953072c785d35e4e12811b250f3c880 WatchSource:0}: Error finding container 5845ef6d5b82404c5a62ab1e6e265dd4e953072c785d35e4e12811b250f3c880: Status 404 returned error can't find the container with id 5845ef6d5b82404c5a62ab1e6e265dd4e953072c785d35e4e12811b250f3c880 Apr 20 13:40:38.998818 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:38.998764 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-667696b866-qpj9f" event={"ID":"2f414fbc-fa7c-4383-9b73-84ed5b26ea30","Type":"ContainerStarted","Data":"5845ef6d5b82404c5a62ab1e6e265dd4e953072c785d35e4e12811b250f3c880"} Apr 20 13:40:42.033300 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:42.033264 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-667696b866-qpj9f" event={"ID":"2f414fbc-fa7c-4383-9b73-84ed5b26ea30","Type":"ContainerStarted","Data":"cfe2c9563a56dd70f0165dac57e8818ffb3f5f64203e2e0bd41a722e672d5f2c"} Apr 20 13:40:42.033708 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:42.033382 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-667696b866-qpj9f" Apr 20 13:40:42.052398 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:42.052344 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-667696b866-qpj9f" podStartSLOduration=1.58341682 podStartE2EDuration="4.052326461s" podCreationTimestamp="2026-04-20 13:40:38 +0000 UTC" firstStartedPulling="2026-04-20 13:40:38.605132215 +0000 UTC m=+594.168823466" lastFinishedPulling="2026-04-20 13:40:41.074041845 +0000 UTC m=+596.637733107" observedRunningTime="2026-04-20 13:40:42.051302372 +0000 UTC m=+597.614993642" watchObservedRunningTime="2026-04-20 13:40:42.052326461 +0000 UTC m=+597.616017732" Apr 20 13:40:44.978941 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:44.978911 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/ovn-acl-logging/0.log" Apr 20 13:40:44.979411 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:44.979359 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/ovn-acl-logging/0.log" Apr 20 13:40:47.999516 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:47.999479 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-5df788645d-ctftr" Apr 20 13:40:48.042735 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:48.042704 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-667696b866-qpj9f" Apr 20 13:40:48.354783 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:48.354745 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-67b98965d5-xn4tt"] Apr 20 13:40:48.360371 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:48.360347 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-67b98965d5-xn4tt" Apr 20 13:40:48.368003 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:48.367979 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-67b98965d5-xn4tt"] Apr 20 13:40:48.459821 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:48.459781 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9c8n\" (UniqueName: \"kubernetes.io/projected/384fb511-4fec-40ec-bcac-eac772c3053d-kube-api-access-c9c8n\") pod \"maas-controller-67b98965d5-xn4tt\" (UID: \"384fb511-4fec-40ec-bcac-eac772c3053d\") " pod="opendatahub/maas-controller-67b98965d5-xn4tt" Apr 20 13:40:48.560986 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:48.560948 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9c8n\" (UniqueName: \"kubernetes.io/projected/384fb511-4fec-40ec-bcac-eac772c3053d-kube-api-access-c9c8n\") pod \"maas-controller-67b98965d5-xn4tt\" (UID: \"384fb511-4fec-40ec-bcac-eac772c3053d\") " pod="opendatahub/maas-controller-67b98965d5-xn4tt" Apr 20 13:40:48.571142 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:48.571118 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9c8n\" (UniqueName: \"kubernetes.io/projected/384fb511-4fec-40ec-bcac-eac772c3053d-kube-api-access-c9c8n\") pod \"maas-controller-67b98965d5-xn4tt\" (UID: \"384fb511-4fec-40ec-bcac-eac772c3053d\") " pod="opendatahub/maas-controller-67b98965d5-xn4tt" Apr 20 13:40:48.672333 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:48.672237 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-67b98965d5-xn4tt" Apr 20 13:40:48.798145 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:48.798122 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-67b98965d5-xn4tt"] Apr 20 13:40:48.801069 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:40:48.801020 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod384fb511_4fec_40ec_bcac_eac772c3053d.slice/crio-1a5ec4ed2f106149320650b18fc059db982943fa0f8acbb5e1cf6b4a2ca649d3 WatchSource:0}: Error finding container 1a5ec4ed2f106149320650b18fc059db982943fa0f8acbb5e1cf6b4a2ca649d3: Status 404 returned error can't find the container with id 1a5ec4ed2f106149320650b18fc059db982943fa0f8acbb5e1cf6b4a2ca649d3 Apr 20 13:40:49.059575 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:49.059534 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-67b98965d5-xn4tt" event={"ID":"384fb511-4fec-40ec-bcac-eac772c3053d","Type":"ContainerStarted","Data":"1a5ec4ed2f106149320650b18fc059db982943fa0f8acbb5e1cf6b4a2ca649d3"} Apr 20 13:40:50.065194 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:50.065158 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-67b98965d5-xn4tt" event={"ID":"384fb511-4fec-40ec-bcac-eac772c3053d","Type":"ContainerStarted","Data":"b79c9851cb3ae824e4335b2890cbedd8645369ddd1c25ac1fb29a7754e2e1cf0"} Apr 20 13:40:50.065579 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:50.065208 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-67b98965d5-xn4tt" Apr 20 13:40:50.087261 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:40:50.087208 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-67b98965d5-xn4tt" podStartSLOduration=1.791044721 podStartE2EDuration="2.087184782s" podCreationTimestamp="2026-04-20 13:40:48 +0000 UTC" firstStartedPulling="2026-04-20 13:40:48.802502753 +0000 UTC m=+604.366194003" lastFinishedPulling="2026-04-20 13:40:49.098642809 +0000 UTC m=+604.662334064" observedRunningTime="2026-04-20 13:40:50.084796815 +0000 UTC m=+605.648488085" watchObservedRunningTime="2026-04-20 13:40:50.087184782 +0000 UTC m=+605.650876052" Apr 20 13:41:01.074393 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:01.074312 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-67b98965d5-xn4tt" Apr 20 13:41:01.123622 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:01.123589 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5df788645d-ctftr"] Apr 20 13:41:01.123853 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:01.123815 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-5df788645d-ctftr" podUID="d2aaef74-ae47-43f1-8c74-894d405e436c" containerName="manager" containerID="cri-o://33c8e8909344e05e3e5f8ac880cd0613489211ba753b70e55705ddf448006859" gracePeriod=10 Apr 20 13:41:01.370827 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:01.370800 2563 util.go:48] "No ready sandbox for pod can be found. 
Apr 20 13:41:01.370827 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:01.370800 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5df788645d-ctftr"
Apr 20 13:41:01.378566 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:01.378546 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6jjh\" (UniqueName: \"kubernetes.io/projected/d2aaef74-ae47-43f1-8c74-894d405e436c-kube-api-access-z6jjh\") pod \"d2aaef74-ae47-43f1-8c74-894d405e436c\" (UID: \"d2aaef74-ae47-43f1-8c74-894d405e436c\") "
Apr 20 13:41:01.380820 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:01.380785 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2aaef74-ae47-43f1-8c74-894d405e436c-kube-api-access-z6jjh" (OuterVolumeSpecName: "kube-api-access-z6jjh") pod "d2aaef74-ae47-43f1-8c74-894d405e436c" (UID: "d2aaef74-ae47-43f1-8c74-894d405e436c"). InnerVolumeSpecName "kube-api-access-z6jjh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 13:41:01.479716 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:01.479683 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z6jjh\" (UniqueName: \"kubernetes.io/projected/d2aaef74-ae47-43f1-8c74-894d405e436c-kube-api-access-z6jjh\") on node \"ip-10-0-132-232.ec2.internal\" DevicePath \"\""
Apr 20 13:41:02.104087 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:02.104037 2563 generic.go:358] "Generic (PLEG): container finished" podID="d2aaef74-ae47-43f1-8c74-894d405e436c" containerID="33c8e8909344e05e3e5f8ac880cd0613489211ba753b70e55705ddf448006859" exitCode=0
Apr 20 13:41:02.104519 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:02.104121 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5df788645d-ctftr"
Apr 20 13:41:02.104519 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:02.104130 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5df788645d-ctftr" event={"ID":"d2aaef74-ae47-43f1-8c74-894d405e436c","Type":"ContainerDied","Data":"33c8e8909344e05e3e5f8ac880cd0613489211ba753b70e55705ddf448006859"}
Apr 20 13:41:02.104519 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:02.104156 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5df788645d-ctftr" event={"ID":"d2aaef74-ae47-43f1-8c74-894d405e436c","Type":"ContainerDied","Data":"513e33bcc1ea884104b7179de29f7589ce66b3f9ac4b2754685e77319a7fcc87"}
Apr 20 13:41:02.104519 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:02.104171 2563 scope.go:117] "RemoveContainer" containerID="33c8e8909344e05e3e5f8ac880cd0613489211ba753b70e55705ddf448006859"
Apr 20 13:41:02.112707 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:02.112686 2563 scope.go:117] "RemoveContainer" containerID="33c8e8909344e05e3e5f8ac880cd0613489211ba753b70e55705ddf448006859"
Apr 20 13:41:02.112980 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:41:02.112960 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33c8e8909344e05e3e5f8ac880cd0613489211ba753b70e55705ddf448006859\": container with ID starting with 33c8e8909344e05e3e5f8ac880cd0613489211ba753b70e55705ddf448006859 not found: ID does not exist" containerID="33c8e8909344e05e3e5f8ac880cd0613489211ba753b70e55705ddf448006859"
Apr 20 13:41:02.113079 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:02.112989 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33c8e8909344e05e3e5f8ac880cd0613489211ba753b70e55705ddf448006859"} err="failed to get container status \"33c8e8909344e05e3e5f8ac880cd0613489211ba753b70e55705ddf448006859\": rpc error: code = NotFound desc = could not find container \"33c8e8909344e05e3e5f8ac880cd0613489211ba753b70e55705ddf448006859\": container with ID starting with 33c8e8909344e05e3e5f8ac880cd0613489211ba753b70e55705ddf448006859 not found: ID does not exist"
Apr 20 13:41:02.149920 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:02.149878 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5df788645d-ctftr"]
Apr 20 13:41:02.153372 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:02.153344 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-5df788645d-ctftr"]
Apr 20 13:41:03.066693 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:03.066661 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2aaef74-ae47-43f1-8c74-894d405e436c" path="/var/lib/kubelet/pods/d2aaef74-ae47-43f1-8c74-894d405e436c/volumes"
Apr 20 13:41:15.256497 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.256456 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm"]
Apr 20 13:41:15.256864 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.256823 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2aaef74-ae47-43f1-8c74-894d405e436c" containerName="manager"
Apr 20 13:41:15.256864 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.256834 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2aaef74-ae47-43f1-8c74-894d405e436c" containerName="manager"
Apr 20 13:41:15.256941 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.256907 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2aaef74-ae47-43f1-8c74-894d405e436c" containerName="manager"
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:15.265305 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.265283 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 20 13:41:15.265439 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.265329 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 13:41:15.265439 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.265353 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-kngc7\"" Apr 20 13:41:15.265439 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.265357 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 13:41:15.271866 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.271842 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm"] Apr 20 13:41:15.303959 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.303928 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/313854e2-6318-47d4-9e66-540042901191-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm\" (UID: \"313854e2-6318-47d4-9e66-540042901191\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:15.304158 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.304072 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/313854e2-6318-47d4-9e66-540042901191-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm\" (UID: \"313854e2-6318-47d4-9e66-540042901191\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:15.304158 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.304126 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/313854e2-6318-47d4-9e66-540042901191-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm\" (UID: \"313854e2-6318-47d4-9e66-540042901191\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:15.304264 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.304153 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svbmp\" (UniqueName: \"kubernetes.io/projected/313854e2-6318-47d4-9e66-540042901191-kube-api-access-svbmp\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm\" (UID: \"313854e2-6318-47d4-9e66-540042901191\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:15.304264 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.304193 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/313854e2-6318-47d4-9e66-540042901191-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm\" (UID: \"313854e2-6318-47d4-9e66-540042901191\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:15.304264 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.304224 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/313854e2-6318-47d4-9e66-540042901191-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm\" (UID: \"313854e2-6318-47d4-9e66-540042901191\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:15.405686 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.405644 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/313854e2-6318-47d4-9e66-540042901191-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm\" (UID: \"313854e2-6318-47d4-9e66-540042901191\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:15.405868 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.405706 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/313854e2-6318-47d4-9e66-540042901191-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm\" (UID: \"313854e2-6318-47d4-9e66-540042901191\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:15.405868 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.405738 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svbmp\" (UniqueName: \"kubernetes.io/projected/313854e2-6318-47d4-9e66-540042901191-kube-api-access-svbmp\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm\" (UID: \"313854e2-6318-47d4-9e66-540042901191\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:15.406207 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.406173 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/313854e2-6318-47d4-9e66-540042901191-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm\" (UID: \"313854e2-6318-47d4-9e66-540042901191\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:15.406357 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.406340 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/313854e2-6318-47d4-9e66-540042901191-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm\" (UID: \"313854e2-6318-47d4-9e66-540042901191\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:15.406480 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.406466 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/313854e2-6318-47d4-9e66-540042901191-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm\" (UID: \"313854e2-6318-47d4-9e66-540042901191\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:15.406573 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.406549 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/313854e2-6318-47d4-9e66-540042901191-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm\" (UID: \"313854e2-6318-47d4-9e66-540042901191\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:15.406971 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.406349 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/313854e2-6318-47d4-9e66-540042901191-home\") pod 
\"e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm\" (UID: \"313854e2-6318-47d4-9e66-540042901191\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:15.407148 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.406925 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/313854e2-6318-47d4-9e66-540042901191-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm\" (UID: \"313854e2-6318-47d4-9e66-540042901191\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:15.410069 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.409383 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/313854e2-6318-47d4-9e66-540042901191-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm\" (UID: \"313854e2-6318-47d4-9e66-540042901191\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:15.410069 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.409682 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/313854e2-6318-47d4-9e66-540042901191-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm\" (UID: \"313854e2-6318-47d4-9e66-540042901191\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:15.413558 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.413534 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svbmp\" (UniqueName: \"kubernetes.io/projected/313854e2-6318-47d4-9e66-540042901191-kube-api-access-svbmp\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm\" (UID: \"313854e2-6318-47d4-9e66-540042901191\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:15.573098 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.573029 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:15.724467 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:15.723956 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm"] Apr 20 13:41:15.726557 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:41:15.726524 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod313854e2_6318_47d4_9e66_540042901191.slice/crio-d07b18fc3961ce789ab13238e3a4096ba7d09a805b0cdfc30b9721a67d8cdcf8 WatchSource:0}: Error finding container d07b18fc3961ce789ab13238e3a4096ba7d09a805b0cdfc30b9721a67d8cdcf8: Status 404 returned error can't find the container with id d07b18fc3961ce789ab13238e3a4096ba7d09a805b0cdfc30b9721a67d8cdcf8 Apr 20 13:41:16.151965 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:16.151921 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" event={"ID":"313854e2-6318-47d4-9e66-540042901191","Type":"ContainerStarted","Data":"d07b18fc3961ce789ab13238e3a4096ba7d09a805b0cdfc30b9721a67d8cdcf8"} Apr 20 13:41:23.179998 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:23.179898 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" event={"ID":"313854e2-6318-47d4-9e66-540042901191","Type":"ContainerStarted","Data":"f650de477e90f465bca7f10742e95e25d74dfd89a56dd402656bb76f219796f3"} Apr 20 13:41:29.202237 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:29.202196 2563 generic.go:358] "Generic (PLEG): container finished" podID="313854e2-6318-47d4-9e66-540042901191" containerID="f650de477e90f465bca7f10742e95e25d74dfd89a56dd402656bb76f219796f3" exitCode=0 Apr 20 13:41:29.202726 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:29.202304 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" event={"ID":"313854e2-6318-47d4-9e66-540042901191","Type":"ContainerDied","Data":"f650de477e90f465bca7f10742e95e25d74dfd89a56dd402656bb76f219796f3"} Apr 20 13:41:31.216988 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:31.216943 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" event={"ID":"313854e2-6318-47d4-9e66-540042901191","Type":"ContainerStarted","Data":"161139afb41e7984aae7c5642af5487ad83073494e00319bccd5ee551d3faeb1"} Apr 20 13:41:31.217574 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:31.217190 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:31.236368 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:31.236313 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" podStartSLOduration=1.534003385 podStartE2EDuration="16.236277232s" podCreationTimestamp="2026-04-20 13:41:15 +0000 UTC" firstStartedPulling="2026-04-20 13:41:15.728569033 +0000 UTC m=+631.292260283" lastFinishedPulling="2026-04-20 13:41:30.430842882 +0000 UTC m=+645.994534130" observedRunningTime="2026-04-20 13:41:31.234463704 +0000 UTC m=+646.798154998" watchObservedRunningTime="2026-04-20 13:41:31.236277232 +0000 UTC m=+646.799968502" Apr 20 13:41:31.344624 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:31.344592 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-667696b866-qpj9f"] Apr 20 
13:41:31.344855 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:31.344833 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-667696b866-qpj9f" podUID="2f414fbc-fa7c-4383-9b73-84ed5b26ea30" containerName="maas-api" containerID="cri-o://cfe2c9563a56dd70f0165dac57e8818ffb3f5f64203e2e0bd41a722e672d5f2c" gracePeriod=30 Apr 20 13:41:31.585137 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:31.585113 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-667696b866-qpj9f" Apr 20 13:41:31.656373 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:31.656337 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbz9z\" (UniqueName: \"kubernetes.io/projected/2f414fbc-fa7c-4383-9b73-84ed5b26ea30-kube-api-access-nbz9z\") pod \"2f414fbc-fa7c-4383-9b73-84ed5b26ea30\" (UID: \"2f414fbc-fa7c-4383-9b73-84ed5b26ea30\") " Apr 20 13:41:31.656373 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:31.656374 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/2f414fbc-fa7c-4383-9b73-84ed5b26ea30-maas-api-tls\") pod \"2f414fbc-fa7c-4383-9b73-84ed5b26ea30\" (UID: \"2f414fbc-fa7c-4383-9b73-84ed5b26ea30\") " Apr 20 13:41:31.658600 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:31.658566 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f414fbc-fa7c-4383-9b73-84ed5b26ea30-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "2f414fbc-fa7c-4383-9b73-84ed5b26ea30" (UID: "2f414fbc-fa7c-4383-9b73-84ed5b26ea30"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 13:41:31.658720 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:31.658611 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f414fbc-fa7c-4383-9b73-84ed5b26ea30-kube-api-access-nbz9z" (OuterVolumeSpecName: "kube-api-access-nbz9z") pod "2f414fbc-fa7c-4383-9b73-84ed5b26ea30" (UID: "2f414fbc-fa7c-4383-9b73-84ed5b26ea30"). InnerVolumeSpecName "kube-api-access-nbz9z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:41:31.756943 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:31.756906 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nbz9z\" (UniqueName: \"kubernetes.io/projected/2f414fbc-fa7c-4383-9b73-84ed5b26ea30-kube-api-access-nbz9z\") on node \"ip-10-0-132-232.ec2.internal\" DevicePath \"\"" Apr 20 13:41:31.756943 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:31.756935 2563 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/2f414fbc-fa7c-4383-9b73-84ed5b26ea30-maas-api-tls\") on node \"ip-10-0-132-232.ec2.internal\" DevicePath \"\"" Apr 20 13:41:32.221691 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:32.221657 2563 generic.go:358] "Generic (PLEG): container finished" podID="2f414fbc-fa7c-4383-9b73-84ed5b26ea30" containerID="cfe2c9563a56dd70f0165dac57e8818ffb3f5f64203e2e0bd41a722e672d5f2c" exitCode=0 Apr 20 13:41:32.222168 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:32.221716 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-667696b866-qpj9f" Apr 20 13:41:32.222168 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:32.221736 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-667696b866-qpj9f" event={"ID":"2f414fbc-fa7c-4383-9b73-84ed5b26ea30","Type":"ContainerDied","Data":"cfe2c9563a56dd70f0165dac57e8818ffb3f5f64203e2e0bd41a722e672d5f2c"} Apr 20 13:41:32.222168 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:32.221776 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-667696b866-qpj9f" event={"ID":"2f414fbc-fa7c-4383-9b73-84ed5b26ea30","Type":"ContainerDied","Data":"5845ef6d5b82404c5a62ab1e6e265dd4e953072c785d35e4e12811b250f3c880"} Apr 20 13:41:32.222168 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:32.221794 2563 scope.go:117] "RemoveContainer" containerID="cfe2c9563a56dd70f0165dac57e8818ffb3f5f64203e2e0bd41a722e672d5f2c" Apr 20 13:41:32.230692 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:32.230671 2563 scope.go:117] "RemoveContainer" containerID="cfe2c9563a56dd70f0165dac57e8818ffb3f5f64203e2e0bd41a722e672d5f2c" Apr 20 13:41:32.230994 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:41:32.230974 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfe2c9563a56dd70f0165dac57e8818ffb3f5f64203e2e0bd41a722e672d5f2c\": container with ID starting with cfe2c9563a56dd70f0165dac57e8818ffb3f5f64203e2e0bd41a722e672d5f2c not found: ID does not exist" containerID="cfe2c9563a56dd70f0165dac57e8818ffb3f5f64203e2e0bd41a722e672d5f2c" Apr 20 13:41:32.231085 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:32.231007 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfe2c9563a56dd70f0165dac57e8818ffb3f5f64203e2e0bd41a722e672d5f2c"} err="failed to get container status \"cfe2c9563a56dd70f0165dac57e8818ffb3f5f64203e2e0bd41a722e672d5f2c\": rpc error: code = NotFound desc = could not find container \"cfe2c9563a56dd70f0165dac57e8818ffb3f5f64203e2e0bd41a722e672d5f2c\": container with ID starting with cfe2c9563a56dd70f0165dac57e8818ffb3f5f64203e2e0bd41a722e672d5f2c not found: ID does not exist" Apr 20 13:41:32.245463 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:32.245432 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-667696b866-qpj9f"] Apr 20 13:41:32.248613 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:32.248583 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-667696b866-qpj9f"] Apr 20 13:41:33.070123 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:33.070094 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f414fbc-fa7c-4383-9b73-84ed5b26ea30" path="/var/lib/kubelet/pods/2f414fbc-fa7c-4383-9b73-84ed5b26ea30/volumes" Apr 20 13:41:42.235429 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:42.235391 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm" Apr 20 13:41:47.263745 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.263707 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"] Apr 20 13:41:47.264424 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.264386 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f414fbc-fa7c-4383-9b73-84ed5b26ea30" containerName="maas-api" Apr 20 13:41:47.264424 ip-10-0-132-232 kubenswrapper[2563]: 
Apr 20 13:41:47.264424 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.264410 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f414fbc-fa7c-4383-9b73-84ed5b26ea30" containerName="maas-api"
Apr 20 13:41:47.264626 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.264498 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f414fbc-fa7c-4383-9b73-84ed5b26ea30" containerName="maas-api"
Apr 20 13:41:47.269784 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.269762 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"
Apr 20 13:41:47.272983 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.272960 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\""
Apr 20 13:41:47.278954 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.278930 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"]
Apr 20 13:41:47.296127 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.296097 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4689a1e9-269e-4939-b965-462e12b18791-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt\" (UID: \"4689a1e9-269e-4939-b965-462e12b18791\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"
Apr 20 13:41:47.296290 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.296192 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gctg\" (UniqueName: \"kubernetes.io/projected/4689a1e9-269e-4939-b965-462e12b18791-kube-api-access-7gctg\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt\" (UID: \"4689a1e9-269e-4939-b965-462e12b18791\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"
Apr 20 13:41:47.296290 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.296225 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4689a1e9-269e-4939-b965-462e12b18791-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt\" (UID: \"4689a1e9-269e-4939-b965-462e12b18791\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"
Apr 20 13:41:47.296290 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.296257 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4689a1e9-269e-4939-b965-462e12b18791-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt\" (UID: \"4689a1e9-269e-4939-b965-462e12b18791\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"
Apr 20 13:41:47.296460 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.296344 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4689a1e9-269e-4939-b965-462e12b18791-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt\" (UID: \"4689a1e9-269e-4939-b965-462e12b18791\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"
Apr 20 13:41:47.296460 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.296399 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4689a1e9-269e-4939-b965-462e12b18791-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt\" (UID: \"4689a1e9-269e-4939-b965-462e12b18791\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"
Apr 20 13:41:47.397325 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.397283 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gctg\" (UniqueName: \"kubernetes.io/projected/4689a1e9-269e-4939-b965-462e12b18791-kube-api-access-7gctg\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt\" (UID: \"4689a1e9-269e-4939-b965-462e12b18791\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"
Apr 20 13:41:47.397325 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.397332 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4689a1e9-269e-4939-b965-462e12b18791-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt\" (UID: \"4689a1e9-269e-4939-b965-462e12b18791\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"
Apr 20 13:41:47.397585 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.397363 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4689a1e9-269e-4939-b965-462e12b18791-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt\" (UID: \"4689a1e9-269e-4939-b965-462e12b18791\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"
Apr 20 13:41:47.397585 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.397436 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4689a1e9-269e-4939-b965-462e12b18791-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt\" (UID: \"4689a1e9-269e-4939-b965-462e12b18791\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"
Apr 20 13:41:47.397585 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.397478 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4689a1e9-269e-4939-b965-462e12b18791-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt\" (UID: \"4689a1e9-269e-4939-b965-462e12b18791\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"
Apr 20 13:41:47.397585 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.397504 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4689a1e9-269e-4939-b965-462e12b18791-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt\" (UID: \"4689a1e9-269e-4939-b965-462e12b18791\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"
Apr 20 13:41:47.397854 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.397823 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/4689a1e9-269e-4939-b965-462e12b18791-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt\" (UID: \"4689a1e9-269e-4939-b965-462e12b18791\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"
Apr 20 13:41:47.397970 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.397866 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4689a1e9-269e-4939-b965-462e12b18791-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt\" (UID: \"4689a1e9-269e-4939-b965-462e12b18791\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"
Apr 20 13:41:47.397970 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.397905 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/4689a1e9-269e-4939-b965-462e12b18791-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt\" (UID: \"4689a1e9-269e-4939-b965-462e12b18791\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"
Apr 20 13:41:47.399981 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.399961 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/4689a1e9-269e-4939-b965-462e12b18791-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt\" (UID: \"4689a1e9-269e-4939-b965-462e12b18791\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"
Apr 20 13:41:47.400179 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.400162 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4689a1e9-269e-4939-b965-462e12b18791-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt\" (UID: \"4689a1e9-269e-4939-b965-462e12b18791\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"
Apr 20 13:41:47.406874 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.406848 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gctg\" (UniqueName: \"kubernetes.io/projected/4689a1e9-269e-4939-b965-462e12b18791-kube-api-access-7gctg\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt\" (UID: \"4689a1e9-269e-4939-b965-462e12b18791\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"
Apr 20 13:41:47.582473 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.582435 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"
Apr 20 13:41:47.723733 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:47.723708 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"]
Apr 20 13:41:47.726023 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:41:47.725993 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4689a1e9_269e_4939_b965_462e12b18791.slice/crio-01ddb39ae1f084b15a8838fafc4b2630065335fb9ed28b714d1439a1945788c5 WatchSource:0}: Error finding container 01ddb39ae1f084b15a8838fafc4b2630065335fb9ed28b714d1439a1945788c5: Status 404 returned error can't find the container with id 01ddb39ae1f084b15a8838fafc4b2630065335fb9ed28b714d1439a1945788c5
Apr 20 13:41:48.282152 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:48.282108 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt" event={"ID":"4689a1e9-269e-4939-b965-462e12b18791","Type":"ContainerStarted","Data":"c6883f48ba2ada73574767b782b6df363b3f1c9ee5cbf4d3219fbdd67eb820ca"}
Apr 20 13:41:48.282152 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:48.282153 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt" event={"ID":"4689a1e9-269e-4939-b965-462e12b18791","Type":"ContainerStarted","Data":"01ddb39ae1f084b15a8838fafc4b2630065335fb9ed28b714d1439a1945788c5"}
Apr 20 13:41:54.304834 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:54.304793 2563 generic.go:358] "Generic (PLEG): container finished" podID="4689a1e9-269e-4939-b965-462e12b18791" containerID="c6883f48ba2ada73574767b782b6df363b3f1c9ee5cbf4d3219fbdd67eb820ca" exitCode=0
Apr 20 13:41:54.305387 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:54.304847 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt" event={"ID":"4689a1e9-269e-4939-b965-462e12b18791","Type":"ContainerDied","Data":"c6883f48ba2ada73574767b782b6df363b3f1c9ee5cbf4d3219fbdd67eb820ca"}
Apr 20 13:41:55.311806 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:55.311764 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt" event={"ID":"4689a1e9-269e-4939-b965-462e12b18791","Type":"ContainerStarted","Data":"2cf802c245597e54bf71c912f07ac4ddf99cfa7cd3f9fda1ce59284ff5fd2365"}
Apr 20 13:41:55.312262 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:55.312035 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt"
Apr 20 13:41:55.330697 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:41:55.330637 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt" podStartSLOduration=8.055516515 podStartE2EDuration="8.330620179s" podCreationTimestamp="2026-04-20 13:41:47 +0000 UTC" firstStartedPulling="2026-04-20 13:41:54.305616382 +0000 UTC m=+669.869307633" lastFinishedPulling="2026-04-20 13:41:54.580720049 +0000 UTC m=+670.144411297" observedRunningTime="2026-04-20 13:41:55.330188607 +0000 UTC m=+670.893879889" watchObservedRunningTime="2026-04-20 13:41:55.330620179 +0000 UTC m=+670.894311450"
pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr"] Apr 20 13:42:00.352493 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.352468 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:42:00.354838 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.354816 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 20 13:42:00.363764 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.363732 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr"] Apr 20 13:42:00.407283 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.407239 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b26f840-e056-4542-8c4b-688c2f587335-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr\" (UID: \"6b26f840-e056-4542-8c4b-688c2f587335\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:42:00.407479 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.407301 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b26f840-e056-4542-8c4b-688c2f587335-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr\" (UID: \"6b26f840-e056-4542-8c4b-688c2f587335\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:42:00.407479 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.407346 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb5l5\" (UniqueName: \"kubernetes.io/projected/6b26f840-e056-4542-8c4b-688c2f587335-kube-api-access-qb5l5\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr\" (UID: \"6b26f840-e056-4542-8c4b-688c2f587335\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:42:00.407479 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.407377 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b26f840-e056-4542-8c4b-688c2f587335-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr\" (UID: \"6b26f840-e056-4542-8c4b-688c2f587335\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:42:00.407653 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.407512 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b26f840-e056-4542-8c4b-688c2f587335-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr\" (UID: \"6b26f840-e056-4542-8c4b-688c2f587335\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:42:00.407653 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.407597 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b26f840-e056-4542-8c4b-688c2f587335-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr\" (UID: \"6b26f840-e056-4542-8c4b-688c2f587335\") " 
pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:42:00.508621 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.508576 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qb5l5\" (UniqueName: \"kubernetes.io/projected/6b26f840-e056-4542-8c4b-688c2f587335-kube-api-access-qb5l5\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr\" (UID: \"6b26f840-e056-4542-8c4b-688c2f587335\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:42:00.508822 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.508628 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b26f840-e056-4542-8c4b-688c2f587335-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr\" (UID: \"6b26f840-e056-4542-8c4b-688c2f587335\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:42:00.508822 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.508675 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6b26f840-e056-4542-8c4b-688c2f587335-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr\" (UID: \"6b26f840-e056-4542-8c4b-688c2f587335\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:42:00.508822 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.508730 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b26f840-e056-4542-8c4b-688c2f587335-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr\" (UID: \"6b26f840-e056-4542-8c4b-688c2f587335\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:42:00.508822 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.508789 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b26f840-e056-4542-8c4b-688c2f587335-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr\" (UID: \"6b26f840-e056-4542-8c4b-688c2f587335\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:42:00.509034 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.508822 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b26f840-e056-4542-8c4b-688c2f587335-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr\" (UID: \"6b26f840-e056-4542-8c4b-688c2f587335\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:42:00.509313 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.509281 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b26f840-e056-4542-8c4b-688c2f587335-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr\" (UID: \"6b26f840-e056-4542-8c4b-688c2f587335\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:42:00.509461 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.509379 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/6b26f840-e056-4542-8c4b-688c2f587335-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr\" (UID: \"6b26f840-e056-4542-8c4b-688c2f587335\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:42:00.509461 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.509424 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6b26f840-e056-4542-8c4b-688c2f587335-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr\" (UID: \"6b26f840-e056-4542-8c4b-688c2f587335\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:42:00.511621 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.511597 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6b26f840-e056-4542-8c4b-688c2f587335-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr\" (UID: \"6b26f840-e056-4542-8c4b-688c2f587335\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:42:00.511802 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.511786 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6b26f840-e056-4542-8c4b-688c2f587335-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr\" (UID: \"6b26f840-e056-4542-8c4b-688c2f587335\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:42:00.516755 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.516728 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb5l5\" (UniqueName: \"kubernetes.io/projected/6b26f840-e056-4542-8c4b-688c2f587335-kube-api-access-qb5l5\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr\" (UID: \"6b26f840-e056-4542-8c4b-688c2f587335\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:42:00.643284 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.643193 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf"] Apr 20 13:42:00.646831 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.646813 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:00.649339 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.649317 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 20 13:42:00.656934 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.656906 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf"] Apr 20 13:42:00.663377 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.663344 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:42:00.710855 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.710454 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/378404bc-807e-4e3c-b92e-cf621de41cbf-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf\" (UID: \"378404bc-807e-4e3c-b92e-cf621de41cbf\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:00.710855 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.710515 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/378404bc-807e-4e3c-b92e-cf621de41cbf-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf\" (UID: \"378404bc-807e-4e3c-b92e-cf621de41cbf\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:00.710855 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.710572 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/378404bc-807e-4e3c-b92e-cf621de41cbf-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf\" (UID: \"378404bc-807e-4e3c-b92e-cf621de41cbf\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:00.710855 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.710631 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzv82\" (UniqueName: \"kubernetes.io/projected/378404bc-807e-4e3c-b92e-cf621de41cbf-kube-api-access-bzv82\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf\" (UID: \"378404bc-807e-4e3c-b92e-cf621de41cbf\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:00.710855 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.710684 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/378404bc-807e-4e3c-b92e-cf621de41cbf-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf\" (UID: \"378404bc-807e-4e3c-b92e-cf621de41cbf\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:00.710855 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.710711 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/378404bc-807e-4e3c-b92e-cf621de41cbf-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf\" (UID: \"378404bc-807e-4e3c-b92e-cf621de41cbf\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:00.807895 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.807856 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr"] Apr 20 13:42:00.812090 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.812027 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/378404bc-807e-4e3c-b92e-cf621de41cbf-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf\" 
(UID: \"378404bc-807e-4e3c-b92e-cf621de41cbf\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:00.812249 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.812097 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/378404bc-807e-4e3c-b92e-cf621de41cbf-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf\" (UID: \"378404bc-807e-4e3c-b92e-cf621de41cbf\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:00.812249 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.812177 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/378404bc-807e-4e3c-b92e-cf621de41cbf-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf\" (UID: \"378404bc-807e-4e3c-b92e-cf621de41cbf\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:00.812249 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.812211 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/378404bc-807e-4e3c-b92e-cf621de41cbf-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf\" (UID: \"378404bc-807e-4e3c-b92e-cf621de41cbf\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:00.812417 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.812268 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/378404bc-807e-4e3c-b92e-cf621de41cbf-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf\" (UID: \"378404bc-807e-4e3c-b92e-cf621de41cbf\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:00.812417 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.812295 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bzv82\" (UniqueName: \"kubernetes.io/projected/378404bc-807e-4e3c-b92e-cf621de41cbf-kube-api-access-bzv82\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf\" (UID: \"378404bc-807e-4e3c-b92e-cf621de41cbf\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:00.812571 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.812538 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/378404bc-807e-4e3c-b92e-cf621de41cbf-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf\" (UID: \"378404bc-807e-4e3c-b92e-cf621de41cbf\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:00.812987 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.812965 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/378404bc-807e-4e3c-b92e-cf621de41cbf-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf\" (UID: \"378404bc-807e-4e3c-b92e-cf621de41cbf\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:00.813233 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.813207 2563 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/378404bc-807e-4e3c-b92e-cf621de41cbf-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf\" (UID: \"378404bc-807e-4e3c-b92e-cf621de41cbf\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:00.814649 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.814623 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/378404bc-807e-4e3c-b92e-cf621de41cbf-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf\" (UID: \"378404bc-807e-4e3c-b92e-cf621de41cbf\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:00.815850 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.815827 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/378404bc-807e-4e3c-b92e-cf621de41cbf-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf\" (UID: \"378404bc-807e-4e3c-b92e-cf621de41cbf\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:00.816760 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:42:00.816737 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b26f840_e056_4542_8c4b_688c2f587335.slice/crio-3185d834c54bc85aa29bd6a5749b194e0d891fd758a20f32560de74bdcc80a75 WatchSource:0}: Error finding container 3185d834c54bc85aa29bd6a5749b194e0d891fd758a20f32560de74bdcc80a75: Status 404 returned error can't find the container with id 3185d834c54bc85aa29bd6a5749b194e0d891fd758a20f32560de74bdcc80a75 Apr 20 13:42:00.822704 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.822489 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzv82\" (UniqueName: \"kubernetes.io/projected/378404bc-807e-4e3c-b92e-cf621de41cbf-kube-api-access-bzv82\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf\" (UID: \"378404bc-807e-4e3c-b92e-cf621de41cbf\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:00.960519 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:00.960482 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:01.107535 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:01.107501 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf"] Apr 20 13:42:01.111757 ip-10-0-132-232 kubenswrapper[2563]: W0420 13:42:01.111717 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod378404bc_807e_4e3c_b92e_cf621de41cbf.slice/crio-c4561af8fa0bd8c4cfa56814324d023c1c01fed9b30e1a938a86e2cbad0b71af WatchSource:0}: Error finding container c4561af8fa0bd8c4cfa56814324d023c1c01fed9b30e1a938a86e2cbad0b71af: Status 404 returned error can't find the container with id c4561af8fa0bd8c4cfa56814324d023c1c01fed9b30e1a938a86e2cbad0b71af Apr 20 13:42:01.335482 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:01.335441 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" event={"ID":"6b26f840-e056-4542-8c4b-688c2f587335","Type":"ContainerStarted","Data":"386dcd554a9170d2a0b4ee8851efad596144dbbe65083ffc31b53028a6b04aa0"} Apr 20 13:42:01.335789 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:01.335490 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" event={"ID":"6b26f840-e056-4542-8c4b-688c2f587335","Type":"ContainerStarted","Data":"3185d834c54bc85aa29bd6a5749b194e0d891fd758a20f32560de74bdcc80a75"} Apr 20 13:42:01.337030 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:01.336999 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" event={"ID":"378404bc-807e-4e3c-b92e-cf621de41cbf","Type":"ContainerStarted","Data":"83e2df3a0676e1aded437a0c02bb02e890f8a273a2fc6f06f2583f1f2edcb243"} Apr 20 13:42:01.337171 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:01.337036 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" event={"ID":"378404bc-807e-4e3c-b92e-cf621de41cbf","Type":"ContainerStarted","Data":"c4561af8fa0bd8c4cfa56814324d023c1c01fed9b30e1a938a86e2cbad0b71af"} Apr 20 13:42:06.328706 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:06.328674 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt" Apr 20 13:42:07.369306 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:07.369271 2563 generic.go:358] "Generic (PLEG): container finished" podID="378404bc-807e-4e3c-b92e-cf621de41cbf" containerID="83e2df3a0676e1aded437a0c02bb02e890f8a273a2fc6f06f2583f1f2edcb243" exitCode=0 Apr 20 13:42:07.369743 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:07.369344 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" event={"ID":"378404bc-807e-4e3c-b92e-cf621de41cbf","Type":"ContainerDied","Data":"83e2df3a0676e1aded437a0c02bb02e890f8a273a2fc6f06f2583f1f2edcb243"} Apr 20 13:42:08.376037 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:08.375999 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" event={"ID":"378404bc-807e-4e3c-b92e-cf621de41cbf","Type":"ContainerStarted","Data":"184f07b93b320b14db81f1101a4038d1023d65222fb95018807188d21df6aac2"} Apr 20 13:42:08.376539 
ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:08.376294 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:08.397981 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:08.397903 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" podStartSLOduration=8.146945269 podStartE2EDuration="8.397883335s" podCreationTimestamp="2026-04-20 13:42:00 +0000 UTC" firstStartedPulling="2026-04-20 13:42:07.370168255 +0000 UTC m=+682.933859506" lastFinishedPulling="2026-04-20 13:42:07.621106322 +0000 UTC m=+683.184797572" observedRunningTime="2026-04-20 13:42:08.394723014 +0000 UTC m=+683.958414283" watchObservedRunningTime="2026-04-20 13:42:08.397883335 +0000 UTC m=+683.961574688" Apr 20 13:42:10.385265 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:10.385226 2563 generic.go:358] "Generic (PLEG): container finished" podID="6b26f840-e056-4542-8c4b-688c2f587335" containerID="386dcd554a9170d2a0b4ee8851efad596144dbbe65083ffc31b53028a6b04aa0" exitCode=0 Apr 20 13:42:10.385265 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:10.385268 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" event={"ID":"6b26f840-e056-4542-8c4b-688c2f587335","Type":"ContainerDied","Data":"386dcd554a9170d2a0b4ee8851efad596144dbbe65083ffc31b53028a6b04aa0"} Apr 20 13:42:11.390400 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:11.390361 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" event={"ID":"6b26f840-e056-4542-8c4b-688c2f587335","Type":"ContainerStarted","Data":"0bed2f4fedeee49bf5b8a2e8726e7609cd4899fa578560f459a22f6da1824c9a"} Apr 20 13:42:11.390828 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:11.390580 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:42:11.411372 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:11.411312 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" podStartSLOduration=11.200267412 podStartE2EDuration="11.411291756s" podCreationTimestamp="2026-04-20 13:42:00 +0000 UTC" firstStartedPulling="2026-04-20 13:42:10.385876743 +0000 UTC m=+685.949567995" lastFinishedPulling="2026-04-20 13:42:10.596901088 +0000 UTC m=+686.160592339" observedRunningTime="2026-04-20 13:42:11.408708049 +0000 UTC m=+686.972399322" watchObservedRunningTime="2026-04-20 13:42:11.411291756 +0000 UTC m=+686.974983027" Apr 20 13:42:19.393165 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:19.393132 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf" Apr 20 13:42:22.408524 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:42:22.408482 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr" Apr 20 13:44:11.916477 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:44:11.916381 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-67b98965d5-xn4tt"] Apr 20 13:44:11.916987 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:44:11.916653 2563 kuberuntime_container.go:864] "Killing container with 
a grace period" pod="opendatahub/maas-controller-67b98965d5-xn4tt" podUID="384fb511-4fec-40ec-bcac-eac772c3053d" containerName="manager" containerID="cri-o://b79c9851cb3ae824e4335b2890cbedd8645369ddd1c25ac1fb29a7754e2e1cf0" gracePeriod=10 Apr 20 13:44:12.175136 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:44:12.175035 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-67b98965d5-xn4tt" Apr 20 13:44:12.209862 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:44:12.209824 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9c8n\" (UniqueName: \"kubernetes.io/projected/384fb511-4fec-40ec-bcac-eac772c3053d-kube-api-access-c9c8n\") pod \"384fb511-4fec-40ec-bcac-eac772c3053d\" (UID: \"384fb511-4fec-40ec-bcac-eac772c3053d\") " Apr 20 13:44:12.212081 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:44:12.212025 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/384fb511-4fec-40ec-bcac-eac772c3053d-kube-api-access-c9c8n" (OuterVolumeSpecName: "kube-api-access-c9c8n") pod "384fb511-4fec-40ec-bcac-eac772c3053d" (UID: "384fb511-4fec-40ec-bcac-eac772c3053d"). InnerVolumeSpecName "kube-api-access-c9c8n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 13:44:12.310805 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:44:12.310767 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c9c8n\" (UniqueName: \"kubernetes.io/projected/384fb511-4fec-40ec-bcac-eac772c3053d-kube-api-access-c9c8n\") on node \"ip-10-0-132-232.ec2.internal\" DevicePath \"\"" Apr 20 13:44:12.801769 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:44:12.801735 2563 generic.go:358] "Generic (PLEG): container finished" podID="384fb511-4fec-40ec-bcac-eac772c3053d" containerID="b79c9851cb3ae824e4335b2890cbedd8645369ddd1c25ac1fb29a7754e2e1cf0" exitCode=0 Apr 20 13:44:12.801966 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:44:12.801789 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-67b98965d5-xn4tt" event={"ID":"384fb511-4fec-40ec-bcac-eac772c3053d","Type":"ContainerDied","Data":"b79c9851cb3ae824e4335b2890cbedd8645369ddd1c25ac1fb29a7754e2e1cf0"} Apr 20 13:44:12.801966 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:44:12.801795 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-67b98965d5-xn4tt" Apr 20 13:44:12.801966 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:44:12.801811 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-67b98965d5-xn4tt" event={"ID":"384fb511-4fec-40ec-bcac-eac772c3053d","Type":"ContainerDied","Data":"1a5ec4ed2f106149320650b18fc059db982943fa0f8acbb5e1cf6b4a2ca649d3"} Apr 20 13:44:12.801966 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:44:12.801826 2563 scope.go:117] "RemoveContainer" containerID="b79c9851cb3ae824e4335b2890cbedd8645369ddd1c25ac1fb29a7754e2e1cf0" Apr 20 13:44:12.811764 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:44:12.811746 2563 scope.go:117] "RemoveContainer" containerID="b79c9851cb3ae824e4335b2890cbedd8645369ddd1c25ac1fb29a7754e2e1cf0" Apr 20 13:44:12.812010 ip-10-0-132-232 kubenswrapper[2563]: E0420 13:44:12.811992 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b79c9851cb3ae824e4335b2890cbedd8645369ddd1c25ac1fb29a7754e2e1cf0\": container with ID starting with b79c9851cb3ae824e4335b2890cbedd8645369ddd1c25ac1fb29a7754e2e1cf0 not found: ID does not exist" containerID="b79c9851cb3ae824e4335b2890cbedd8645369ddd1c25ac1fb29a7754e2e1cf0" Apr 20 13:44:12.812090 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:44:12.812019 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b79c9851cb3ae824e4335b2890cbedd8645369ddd1c25ac1fb29a7754e2e1cf0"} err="failed to get container status \"b79c9851cb3ae824e4335b2890cbedd8645369ddd1c25ac1fb29a7754e2e1cf0\": rpc error: code = NotFound desc = could not find container \"b79c9851cb3ae824e4335b2890cbedd8645369ddd1c25ac1fb29a7754e2e1cf0\": container with ID starting with b79c9851cb3ae824e4335b2890cbedd8645369ddd1c25ac1fb29a7754e2e1cf0 not found: ID does not exist" Apr 20 13:44:12.824782 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:44:12.824753 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-67b98965d5-xn4tt"] Apr 20 13:44:12.828519 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:44:12.828494 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-67b98965d5-xn4tt"] Apr 20 13:44:13.065982 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:44:13.065902 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="384fb511-4fec-40ec-bcac-eac772c3053d" path="/var/lib/kubelet/pods/384fb511-4fec-40ec-bcac-eac772c3053d/volumes" Apr 20 13:45:45.009025 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:45:45.008996 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/ovn-acl-logging/0.log" Apr 20 13:45:45.009957 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:45:45.009940 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/ovn-acl-logging/0.log" Apr 20 13:50:45.034723 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:50:45.034694 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/ovn-acl-logging/0.log" Apr 20 13:50:45.035343 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:50:45.035113 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/ovn-acl-logging/0.log" 
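The deletion of opendatahub/maas-controller-67b98965d5-xn4tt above walks through the kubelet's full graceful-termination sequence: SyncLoop DELETE, "Killing container with a grace period" (gracePeriod=10), UnmountVolume.TearDown and "Volume detached" for the projected service-account token, the PLEG ContainerDied events, RemoveContainer, then SyncLoop REMOVE and the orphaned-volume cleanup. The "ContainerStatus from runtime service failed ... NotFound" error in the middle is the usual benign race: the kubelet asks CRI-O for the status of a container it has just removed. A minimal sketch for reconstructing that timeline from a dump like this one, assuming the dump is saved one record per line (journalctl's default); the file name journal.log and the helper are illustrative, not part of the log:

```python
#!/usr/bin/env python3
"""Reconstruct a pod's teardown timeline from a journalctl kubelet dump."""
import re
import sys

# Stage markers in the exact wording the kubelet uses in the records above.
STAGES = [
    "SyncLoop DELETE",
    "Killing container with a grace period",
    "UnmountVolume.TearDown succeeded",
    "Volume detached",
    "ContainerDied",
    "RemoveContainer",
    "SyncLoop REMOVE",
    "Cleaned up orphaned pod volumes dir",
]

# Leading journal timestamp, e.g. "Apr 20 13:44:12.824782".
TS = re.compile(r"^(\w{3} \d{2} \d{2}:\d{2}:\d{2}\.\d+)")

def teardown_timeline(path: str, needle: str):
    """Yield (timestamp, stage) for lines mentioning `needle`.

    Some records carry the pod name, others only the pod UID, so run once
    with each (here: maas-controller-67b98965d5-xn4tt and
    384fb511-4fec-40ec-bcac-eac772c3053d) to cover the whole sequence.
    """
    with open(path) as fh:
        for line in fh:
            if needle not in line:
                continue
            m = TS.match(line)
            for stage in STAGES:
                if stage in line:
                    yield (m.group(1) if m else "?", stage)

if __name__ == "__main__":
    # e.g.: teardown.py journal.log 384fb511-4fec-40ec-bcac-eac772c3053d
    for ts, stage in teardown_timeline(sys.argv[1], sys.argv[2]):
        print(f"{ts}  {stage}")
```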
Apr 20 13:55:45.057987 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:55:45.057951 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/ovn-acl-logging/0.log" Apr 20 13:55:45.060438 ip-10-0-132-232 kubenswrapper[2563]: I0420 13:55:45.060141 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/ovn-acl-logging/0.log" Apr 20 14:00:45.082581 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:00:45.082470 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/ovn-acl-logging/0.log" Apr 20 14:00:45.085814 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:00:45.085310 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/ovn-acl-logging/0.log" Apr 20 14:01:38.288599 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:38.288564 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-46dfm_12f63628-da5a-4c24-b955-45c6549f5e08/manager/0.log" Apr 20 14:01:38.680972 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:38.680889 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-scs84_090657bd-bd1c-4a9c-965a-1e6543166a2a/manager/0.log" Apr 20 14:01:38.933897 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:38.933806 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-8cd4c57cb-ms4qc_c062ebea-ac1f-4302-8c43-6f2082de5b19/manager/0.log" Apr 20 14:01:39.158830 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:39.158799 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-kxs52_f5e08680-837e-42b3-906f-4546cdc8ff8f/postgres/0.log" Apr 20 14:01:40.560142 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:40.560114 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-pmpw4_f9d2b9e6-4f0b-4645-9b13-4f7e53c18c72/manager/0.log" Apr 20 14:01:41.713963 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:41.713931 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-t28m6_02780b87-93fa-4e8b-8727-f2b1580ff6ee/discovery/0.log" Apr 20 14:01:41.935887 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:41.935853 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-54c65669-xrgmc_3c21b573-c73e-4d3c-bac7-da1338a4aa40/kube-auth-proxy/0.log" Apr 20 14:01:42.047687 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:42.047652 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-nwrxv_774a70ec-4f0b-4ad0-b0fa-7bae4bf52465/istio-proxy/0.log" Apr 20 14:01:42.617533 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:42.617497 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm_313854e2-6318-47d4-9e66-540042901191/storage-initializer/0.log" Apr 20 14:01:42.624781 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:42.624756 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-dkwxm_313854e2-6318-47d4-9e66-540042901191/main/0.log" Apr 
20 14:01:42.850891 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:42.850863 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf_378404bc-807e-4e3c-b92e-cf621de41cbf/main/0.log" Apr 20 14:01:42.856884 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:42.856866 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb2wtf_378404bc-807e-4e3c-b92e-cf621de41cbf/storage-initializer/0.log" Apr 20 14:01:42.972722 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:42.972639 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt_4689a1e9-269e-4939-b965-462e12b18791/storage-initializer/0.log" Apr 20 14:01:42.979512 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:42.979487 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-cq4wt_4689a1e9-269e-4939-b965-462e12b18791/main/0.log" Apr 20 14:01:43.084848 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:43.084817 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr_6b26f840-e056-4542-8c4b-688c2f587335/main/0.log" Apr 20 14:01:43.092121 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:43.092092 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-zcfbr_6b26f840-e056-4542-8c4b-688c2f587335/storage-initializer/0.log" Apr 20 14:01:50.497774 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:50.497741 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-ksm7s_68a134ea-4533-4317-bf3f-b4e22e808c81/global-pull-secret-syncer/0.log" Apr 20 14:01:50.547500 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:50.547470 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-l7wx7_22136a88-f60c-4f20-8b96-f6af8da37f19/konnectivity-agent/0.log" Apr 20 14:01:50.630837 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:50.630809 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-232.ec2.internal_0ea90522398b66e089408acd5ec34cb0/haproxy/0.log" Apr 20 14:01:55.012088 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:55.012014 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-pmpw4_f9d2b9e6-4f0b-4645-9b13-4f7e53c18c72/manager/0.log" Apr 20 14:01:57.212604 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:57.212573 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wxck5_d8055fd5-cb97-495a-88d8-f9b6239b99f3/node-exporter/0.log" Apr 20 14:01:57.232234 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:57.232209 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wxck5_d8055fd5-cb97-495a-88d8-f9b6239b99f3/kube-rbac-proxy/0.log" Apr 20 14:01:57.254794 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:57.254769 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wxck5_d8055fd5-cb97-495a-88d8-f9b6239b99f3/init-textfile/0.log" Apr 20 14:01:58.785399 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.785365 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh"] Apr 20 
14:01:58.785816 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.785739 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="384fb511-4fec-40ec-bcac-eac772c3053d" containerName="manager" Apr 20 14:01:58.785816 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.785751 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="384fb511-4fec-40ec-bcac-eac772c3053d" containerName="manager" Apr 20 14:01:58.785816 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.785816 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="384fb511-4fec-40ec-bcac-eac772c3053d" containerName="manager" Apr 20 14:01:58.789019 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.788996 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" Apr 20 14:01:58.791438 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.791407 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7jq8s\"/\"openshift-service-ca.crt\"" Apr 20 14:01:58.792368 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.792350 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7jq8s\"/\"kube-root-ca.crt\"" Apr 20 14:01:58.792469 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.792352 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7jq8s\"/\"default-dockercfg-w22fr\"" Apr 20 14:01:58.799258 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.799231 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh"] Apr 20 14:01:58.844313 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.844280 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/664711e1-d156-4b14-afc7-6de61b91ce2b-lib-modules\") pod \"perf-node-gather-daemonset-qfjfh\" (UID: \"664711e1-d156-4b14-afc7-6de61b91ce2b\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" Apr 20 14:01:58.844313 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.844321 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/664711e1-d156-4b14-afc7-6de61b91ce2b-podres\") pod \"perf-node-gather-daemonset-qfjfh\" (UID: \"664711e1-d156-4b14-afc7-6de61b91ce2b\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" Apr 20 14:01:58.844545 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.844347 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/664711e1-d156-4b14-afc7-6de61b91ce2b-sys\") pod \"perf-node-gather-daemonset-qfjfh\" (UID: \"664711e1-d156-4b14-afc7-6de61b91ce2b\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" Apr 20 14:01:58.844545 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.844393 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/664711e1-d156-4b14-afc7-6de61b91ce2b-proc\") pod \"perf-node-gather-daemonset-qfjfh\" (UID: \"664711e1-d156-4b14-afc7-6de61b91ce2b\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" Apr 20 14:01:58.844545 ip-10-0-132-232 kubenswrapper[2563]: I0420 
14:01:58.844469 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wflgl\" (UniqueName: \"kubernetes.io/projected/664711e1-d156-4b14-afc7-6de61b91ce2b-kube-api-access-wflgl\") pod \"perf-node-gather-daemonset-qfjfh\" (UID: \"664711e1-d156-4b14-afc7-6de61b91ce2b\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" Apr 20 14:01:58.945264 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.945211 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/664711e1-d156-4b14-afc7-6de61b91ce2b-lib-modules\") pod \"perf-node-gather-daemonset-qfjfh\" (UID: \"664711e1-d156-4b14-afc7-6de61b91ce2b\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" Apr 20 14:01:58.945264 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.945284 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/664711e1-d156-4b14-afc7-6de61b91ce2b-podres\") pod \"perf-node-gather-daemonset-qfjfh\" (UID: \"664711e1-d156-4b14-afc7-6de61b91ce2b\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" Apr 20 14:01:58.945526 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.945315 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/664711e1-d156-4b14-afc7-6de61b91ce2b-sys\") pod \"perf-node-gather-daemonset-qfjfh\" (UID: \"664711e1-d156-4b14-afc7-6de61b91ce2b\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" Apr 20 14:01:58.945526 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.945339 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/664711e1-d156-4b14-afc7-6de61b91ce2b-proc\") pod \"perf-node-gather-daemonset-qfjfh\" (UID: \"664711e1-d156-4b14-afc7-6de61b91ce2b\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" Apr 20 14:01:58.945526 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.945413 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/664711e1-d156-4b14-afc7-6de61b91ce2b-proc\") pod \"perf-node-gather-daemonset-qfjfh\" (UID: \"664711e1-d156-4b14-afc7-6de61b91ce2b\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" Apr 20 14:01:58.945526 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.945413 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/664711e1-d156-4b14-afc7-6de61b91ce2b-lib-modules\") pod \"perf-node-gather-daemonset-qfjfh\" (UID: \"664711e1-d156-4b14-afc7-6de61b91ce2b\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" Apr 20 14:01:58.945526 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.945443 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/664711e1-d156-4b14-afc7-6de61b91ce2b-podres\") pod \"perf-node-gather-daemonset-qfjfh\" (UID: \"664711e1-d156-4b14-afc7-6de61b91ce2b\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" Apr 20 14:01:58.945526 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.945410 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wflgl\" (UniqueName: 
\"kubernetes.io/projected/664711e1-d156-4b14-afc7-6de61b91ce2b-kube-api-access-wflgl\") pod \"perf-node-gather-daemonset-qfjfh\" (UID: \"664711e1-d156-4b14-afc7-6de61b91ce2b\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" Apr 20 14:01:58.945526 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.945449 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/664711e1-d156-4b14-afc7-6de61b91ce2b-sys\") pod \"perf-node-gather-daemonset-qfjfh\" (UID: \"664711e1-d156-4b14-afc7-6de61b91ce2b\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" Apr 20 14:01:58.953854 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:58.953812 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wflgl\" (UniqueName: \"kubernetes.io/projected/664711e1-d156-4b14-afc7-6de61b91ce2b-kube-api-access-wflgl\") pod \"perf-node-gather-daemonset-qfjfh\" (UID: \"664711e1-d156-4b14-afc7-6de61b91ce2b\") " pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" Apr 20 14:01:59.100728 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:59.100625 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" Apr 20 14:01:59.230588 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:59.230541 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh"] Apr 20 14:01:59.233577 ip-10-0-132-232 kubenswrapper[2563]: W0420 14:01:59.233546 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod664711e1_d156_4b14_afc7_6de61b91ce2b.slice/crio-19e983d4792255e2aab48cd40c88d4beecf7b5ae1d2a264b16142471df48aa6c WatchSource:0}: Error finding container 19e983d4792255e2aab48cd40c88d4beecf7b5ae1d2a264b16142471df48aa6c: Status 404 returned error can't find the container with id 19e983d4792255e2aab48cd40c88d4beecf7b5ae1d2a264b16142471df48aa6c Apr 20 14:01:59.235373 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:59.235352 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 14:01:59.487635 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:59.487537 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" event={"ID":"664711e1-d156-4b14-afc7-6de61b91ce2b","Type":"ContainerStarted","Data":"733df7fa8fd883338ba8829184e306fc957005855e2032376623d764bcf67d99"} Apr 20 14:01:59.487635 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:59.487585 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" event={"ID":"664711e1-d156-4b14-afc7-6de61b91ce2b","Type":"ContainerStarted","Data":"19e983d4792255e2aab48cd40c88d4beecf7b5ae1d2a264b16142471df48aa6c"} Apr 20 14:01:59.487635 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:59.487620 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" Apr 20 14:01:59.504358 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:01:59.504299 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" podStartSLOduration=1.504277764 podStartE2EDuration="1.504277764s" podCreationTimestamp="2026-04-20 14:01:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:01:59.503590169 +0000 UTC m=+1875.067281439" watchObservedRunningTime="2026-04-20 14:01:59.504277764 +0000 UTC m=+1875.067969036" Apr 20 14:02:00.397191 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:00.397164 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-gzsss_74ecab64-c701-4519-98e0-66ade918c111/volume-data-source-validator/0.log" Apr 20 14:02:01.203438 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:01.203406 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jq28g_5fd3c7eb-0c40-4911-a067-10cda31de0d7/dns/0.log" Apr 20 14:02:01.222346 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:01.222322 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jq28g_5fd3c7eb-0c40-4911-a067-10cda31de0d7/kube-rbac-proxy/0.log" Apr 20 14:02:01.378911 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:01.378880 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rxqc7_8ca98088-8b65-4efe-ad4e-3df5a8fe02b5/dns-node-resolver/0.log" Apr 20 14:02:01.829384 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:01.829353 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5844f64979-4t7rs_74c7d945-ac32-4efe-ba31-a3bd50d2d706/registry/0.log" Apr 20 14:02:01.846685 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:01.846658 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hw898_b1a9fc24-9a0e-4d24-aa45-ec1711e1399a/node-ca/0.log" Apr 20 14:02:02.800511 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:02.800478 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-t28m6_02780b87-93fa-4e8b-8727-f2b1580ff6ee/discovery/0.log" Apr 20 14:02:02.842706 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:02.842678 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-54c65669-xrgmc_3c21b573-c73e-4d3c-bac7-da1338a4aa40/kube-auth-proxy/0.log" Apr 20 14:02:02.892799 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:02.892764 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-nwrxv_774a70ec-4f0b-4ad0-b0fa-7bae4bf52465/istio-proxy/0.log" Apr 20 14:02:03.450350 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:03.450321 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-lw64p_8c0036f9-a08e-4eac-8ad7-301fe4765604/serve-healthcheck-canary/0.log" Apr 20 14:02:03.867091 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:03.867018 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-qrddw_5f9cc52f-4998-4934-a999-16ee91bf3d4a/insights-operator/0.log" Apr 20 14:02:03.868598 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:03.868577 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-qrddw_5f9cc52f-4998-4934-a999-16ee91bf3d4a/insights-operator/1.log" Apr 20 14:02:04.015147 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:04.015116 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qdwz4_b2049f90-5ce4-4282-9210-29370c7e0bda/kube-rbac-proxy/0.log" 
Apr 20 14:02:04.033823 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:04.033800 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qdwz4_b2049f90-5ce4-4282-9210-29370c7e0bda/exporter/0.log" Apr 20 14:02:04.052663 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:04.052633 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qdwz4_b2049f90-5ce4-4282-9210-29370c7e0bda/extractor/0.log" Apr 20 14:02:05.502223 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:05.502193 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7jq8s/perf-node-gather-daemonset-qfjfh" Apr 20 14:02:05.914147 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:05.914113 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-46dfm_12f63628-da5a-4c24-b955-45c6549f5e08/manager/0.log" Apr 20 14:02:06.008882 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:06.008851 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-scs84_090657bd-bd1c-4a9c-965a-1e6543166a2a/manager/0.log" Apr 20 14:02:06.087267 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:06.087238 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-8cd4c57cb-ms4qc_c062ebea-ac1f-4302-8c43-6f2082de5b19/manager/0.log" Apr 20 14:02:06.134943 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:06.134914 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-kxs52_f5e08680-837e-42b3-906f-4546cdc8ff8f/postgres/0.log" Apr 20 14:02:13.431311 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:13.431275 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xwkrd_f22ff795-52da-4095-9d35-f9d44f2b8239/kube-multus-additional-cni-plugins/0.log" Apr 20 14:02:13.453040 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:13.453008 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xwkrd_f22ff795-52da-4095-9d35-f9d44f2b8239/egress-router-binary-copy/0.log" Apr 20 14:02:13.473221 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:13.473197 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xwkrd_f22ff795-52da-4095-9d35-f9d44f2b8239/cni-plugins/0.log" Apr 20 14:02:13.494070 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:13.494023 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xwkrd_f22ff795-52da-4095-9d35-f9d44f2b8239/bond-cni-plugin/0.log" Apr 20 14:02:13.514015 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:13.513988 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xwkrd_f22ff795-52da-4095-9d35-f9d44f2b8239/routeoverride-cni/0.log" Apr 20 14:02:13.534144 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:13.534118 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xwkrd_f22ff795-52da-4095-9d35-f9d44f2b8239/whereabouts-cni-bincopy/0.log" Apr 20 14:02:13.553545 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:13.553523 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-xwkrd_f22ff795-52da-4095-9d35-f9d44f2b8239/whereabouts-cni/0.log" Apr 20 14:02:13.579961 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:13.579932 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bg8cz_99cf3e2c-0587-4d53-ae7a-4dfaea501010/kube-multus/0.log" Apr 20 14:02:13.669396 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:13.669365 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5pkrd_02041a2f-e9fd-4902-a9a4-47e4cd2889e4/network-metrics-daemon/0.log" Apr 20 14:02:13.689749 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:13.689676 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-5pkrd_02041a2f-e9fd-4902-a9a4-47e4cd2889e4/kube-rbac-proxy/0.log" Apr 20 14:02:14.797384 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:14.797350 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/ovn-controller/0.log" Apr 20 14:02:14.818440 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:14.818412 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/ovn-acl-logging/0.log" Apr 20 14:02:14.826481 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:14.826453 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/ovn-acl-logging/1.log" Apr 20 14:02:14.846078 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:14.846016 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/kube-rbac-proxy-node/0.log" Apr 20 14:02:14.869345 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:14.869311 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 14:02:14.892600 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:14.892578 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/northd/0.log" Apr 20 14:02:14.915168 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:14.915135 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/nbdb/0.log" Apr 20 14:02:14.935322 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:14.935296 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/sbdb/0.log" Apr 20 14:02:15.035863 ip-10-0-132-232 kubenswrapper[2563]: I0420 14:02:15.035813 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mp62t_69f16c24-4d9a-4565-82a7-dbe15561755e/ovnkube-controller/0.log"