Apr 22 21:09:02.549662 ip-10-0-134-137 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 21:09:02.896053 ip-10-0-134-137 kubenswrapper[2566]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 21:09:02.896053 ip-10-0-134-137 kubenswrapper[2566]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 21:09:02.896053 ip-10-0-134-137 kubenswrapper[2566]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 21:09:02.896053 ip-10-0-134-137 kubenswrapper[2566]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 21:09:02.896053 ip-10-0-134-137 kubenswrapper[2566]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
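The deprecation warnings above all point at the same remedy: move the flag values into the KubeletConfiguration file named by --config. A minimal sketch of the equivalent fields, assuming the v1beta1 API; the values shown are illustrative, not taken from this node:

```yaml
# Hypothetical fragment of the file passed via --config
# (this node uses /etc/kubernetes/kubelet.conf, per the FLAG dump below).
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir (example path)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (example reservations)
systemReserved:
  cpu: 500m
  memory: 1Gi
```

--minimum-container-ttl-duration has no config-file equivalent; the warning says to use eviction settings (evictionHard / evictionSoft in the same file) instead.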
Apr 22 21:09:02.897444 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.897338 2566 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 21:09:02.899610 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899596 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 21:09:02.899610 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899610 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 21:09:02.899671 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899614 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 21:09:02.899671 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899617 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 21:09:02.899671 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899620 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 21:09:02.899671 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899623 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 21:09:02.899671 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899626 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 21:09:02.899671 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899629 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 21:09:02.899671 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899631 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 21:09:02.899671 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899634 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 21:09:02.899671 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899637 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 21:09:02.899671 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899642 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 21:09:02.899671 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899645 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 21:09:02.899671 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899648 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 21:09:02.899671 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899651 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 21:09:02.899671 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899654 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 21:09:02.899671 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899657 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 21:09:02.899671 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899660 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 21:09:02.899671 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899662 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 21:09:02.899671 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899665 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 21:09:02.899671 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899668 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 21:09:02.900115 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899670 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 21:09:02.900115 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899673 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 21:09:02.900115 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899676 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 21:09:02.900115 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899679 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 21:09:02.900115 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899682 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 21:09:02.900115 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899685 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 21:09:02.900115 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899689 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 21:09:02.900115 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899692 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 21:09:02.900115 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899694 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 21:09:02.900115 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899697 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 21:09:02.900115 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899700 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 21:09:02.900115 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899702 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 21:09:02.900115 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899705 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 21:09:02.900115 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899708 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 21:09:02.900115 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899717 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 21:09:02.900115 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899721 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 21:09:02.900115 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899725 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 21:09:02.900115 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899727 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 21:09:02.900115 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899730 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 21:09:02.900581 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899733 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 21:09:02.900581 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899735 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 21:09:02.900581 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899737 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 21:09:02.900581 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899740 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 21:09:02.900581 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899742 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 21:09:02.900581 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899745 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 21:09:02.900581 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899748 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 21:09:02.900581 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899750 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 21:09:02.900581 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899753 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 21:09:02.900581 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899756 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 21:09:02.900581 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899758 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 21:09:02.900581 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899761 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 21:09:02.900581 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899763 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 21:09:02.900581 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899766 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 21:09:02.900581 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899769 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 21:09:02.900581 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899772 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 21:09:02.900581 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899775 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 21:09:02.900581 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899777 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 21:09:02.900581 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899780 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 21:09:02.900581 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899784 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 21:09:02.901062 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899786 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 21:09:02.901062 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899789 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 21:09:02.901062 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899792 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 21:09:02.901062 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899795 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 21:09:02.901062 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899798 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 21:09:02.901062 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899801 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 21:09:02.901062 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899803 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 21:09:02.901062 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899806 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 21:09:02.901062 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899808 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 21:09:02.901062 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899811 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 21:09:02.901062 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899813 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 21:09:02.901062 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899816 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 21:09:02.901062 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899818 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 21:09:02.901062 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899820 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 21:09:02.901062 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899823 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 21:09:02.901062 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899826 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 21:09:02.901062 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899828 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 21:09:02.901062 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899831 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 21:09:02.901062 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899833 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 21:09:02.901062 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899836 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 21:09:02.901558 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899838 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 21:09:02.901558 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899841 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 21:09:02.901558 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899843 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 21:09:02.901558 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899846 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 21:09:02.901558 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899848 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 21:09:02.901558 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.899851 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 21:09:02.901558 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901247 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 21:09:02.901558 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901255 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 21:09:02.901558 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901258 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 21:09:02.901558 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901261 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 21:09:02.901558 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901265 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 21:09:02.901558 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901267 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 21:09:02.901558 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901270 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 21:09:02.901558 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901273 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 21:09:02.901558 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901275 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 21:09:02.901558 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901279 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 21:09:02.901558 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901284 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 21:09:02.901558 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901287 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 21:09:02.901558 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901290 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 21:09:02.902011 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901293 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 21:09:02.902011 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901295 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 21:09:02.902011 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901298 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 21:09:02.902011 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901300 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 21:09:02.902011 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901303 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 21:09:02.902011 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901306 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 21:09:02.902011 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901308 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 21:09:02.902011 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901311 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 21:09:02.902011 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901313 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 21:09:02.902011 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901316 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 21:09:02.902011 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901318 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 21:09:02.902011 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901321 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 21:09:02.902011 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901323 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 21:09:02.902011 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901326 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 21:09:02.902011 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901328 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 21:09:02.902011 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901331 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 21:09:02.902011 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901334 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 21:09:02.902011 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901336 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 21:09:02.902011 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901339 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 21:09:02.902011 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901344 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 21:09:02.902522 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901346 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 21:09:02.902522 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901349 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 21:09:02.902522 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901352 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 21:09:02.902522 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901354 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 21:09:02.902522 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901357 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 21:09:02.902522 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901360 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 21:09:02.902522 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901362 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 21:09:02.902522 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901365 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 21:09:02.902522 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901368 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 21:09:02.902522 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901370 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 21:09:02.902522 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901373 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 21:09:02.902522 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901376 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 21:09:02.902522 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901378 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 21:09:02.902522 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901381 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 21:09:02.902522 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901383 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 21:09:02.902522 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901386 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 21:09:02.902522 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901389 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 21:09:02.902522 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901391 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 21:09:02.902522 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901394 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 21:09:02.902522 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901396 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 21:09:02.903007 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901399 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 21:09:02.903007 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901401 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 21:09:02.903007 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901404 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 21:09:02.903007 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901421 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 21:09:02.903007 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901426 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 21:09:02.903007 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901429 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 21:09:02.903007 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901434 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 21:09:02.903007 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901441 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 21:09:02.903007 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901445 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 21:09:02.903007 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901448 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 21:09:02.903007 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901451 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 21:09:02.903007 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901454 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 21:09:02.903007 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901457 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 21:09:02.903007 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901460 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 21:09:02.903007 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901463 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 21:09:02.903007 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901466 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 21:09:02.903007 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901468 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 21:09:02.903007 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901471 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 21:09:02.903007 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901474 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 21:09:02.903576 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901476 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 21:09:02.903576 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901479 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 21:09:02.903576 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901482 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 21:09:02.903576 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901484 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 21:09:02.903576 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901487 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 21:09:02.903576 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901491 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 21:09:02.903576 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901494 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 21:09:02.903576 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901496 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 21:09:02.903576 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901499 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 21:09:02.903576 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901501 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 21:09:02.903576 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901503 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 21:09:02.903576 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901506 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 21:09:02.903576 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901509 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 21:09:02.903576 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.901511 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 21:09:02.903576 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.901989 2566 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 21:09:02.903576 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.901998 2566 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 21:09:02.903576 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902004 2566 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 21:09:02.903576 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902009 2566 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 21:09:02.903576 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902014 2566 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 21:09:02.903576 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902017 2566 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 21:09:02.903576 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902022 2566 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902026 2566 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902030 2566 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902033 2566 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902036 2566 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902039 2566 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902042 2566 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902046 2566 flags.go:64] FLAG: --cgroup-root=""
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902049 2566 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902051 2566 flags.go:64] FLAG: --client-ca-file=""
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902054 2566 flags.go:64] FLAG: --cloud-config=""
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902057 2566 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902060 2566 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902065 2566 flags.go:64] FLAG: --cluster-domain=""
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902068 2566 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902071 2566 flags.go:64] FLAG: --config-dir=""
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902073 2566 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902078 2566 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902082 2566 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902085 2566 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902088 2566 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902091 2566 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902095 2566 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902098 2566 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 21:09:02.904085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902101 2566 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902104 2566 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902108 2566 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902112 2566 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902115 2566 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902118 2566 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902121 2566 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902124 2566 flags.go:64] FLAG: --enable-server="true"
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902127 2566 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902131 2566 flags.go:64] FLAG: --event-burst="100"
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902134 2566 flags.go:64] FLAG: --event-qps="50"
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902137 2566 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902140 2566 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902143 2566 flags.go:64] FLAG: --eviction-hard=""
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902148 2566 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902150 2566 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902154 2566 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902156 2566 flags.go:64] FLAG: --eviction-soft=""
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902159 2566 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902162 2566 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902165 2566 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902168 2566 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902171 2566 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902174 2566 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902176 2566 flags.go:64] FLAG: --feature-gates=""
Apr 22 21:09:02.904669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902180 2566 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902183 2566 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902187 2566 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902190 2566 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902193 2566 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902197 2566 flags.go:64] FLAG: --help="false"
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902199 2566 flags.go:64] FLAG: --hostname-override="ip-10-0-134-137.ec2.internal"
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902202 2566 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902205 2566 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902208 2566 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902212 2566 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902215 2566 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902218 2566 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902221 2566 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902224 2566 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902227 2566 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902230 2566 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902233 2566 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902235 2566 flags.go:64] FLAG: --kube-reserved=""
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902238 2566 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902241 2566 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902244 2566 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902247 2566 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902250 2566 flags.go:64] FLAG: --lock-file=""
Apr 22 21:09:02.905291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902252 2566 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902255 2566 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902258 2566 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902263 2566 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902266 2566 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902269 2566 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902272 2566 flags.go:64] FLAG: --logging-format="text"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902274 2566 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902277 2566 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902280 2566 flags.go:64] FLAG: --manifest-url=""
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902283 2566 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902288 2566 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902291 2566 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902295 2566 flags.go:64] FLAG: --max-pods="110"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902298 2566 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902301 2566 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902304 2566 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902307 2566 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902310 2566 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902313 2566 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902316 2566 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902324 2566 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902327 2566 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902330 2566 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 21:09:02.905887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902333 2566 flags.go:64] FLAG: --pod-cidr=""
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902336 2566 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902341 2566 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902344 2566 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902348 2566 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902350 2566 flags.go:64] FLAG: --port="10250"
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902353 2566 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902356 2566 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-07a16380e4d872333"
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902359 2566 flags.go:64] FLAG: --qos-reserved=""
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902362 2566 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902365 2566 flags.go:64] FLAG: --register-node="true"
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902368 2566 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902371 2566 flags.go:64] FLAG: --register-with-taints=""
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902375 2566 flags.go:64] FLAG: --registry-burst="10"
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902378 2566 flags.go:64] FLAG: --registry-qps="5"
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902380 2566 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902383 2566 flags.go:64] FLAG: --reserved-memory=""
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902387 2566 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902390 2566 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902393 2566 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902396 2566 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902399 2566 flags.go:64] FLAG: --runonce="false"
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902402 2566 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902405 2566 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902420 2566 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 21:09:02.906504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902423 2566 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902426 2566 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902429 2566 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902432 2566 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902436 2566 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902439 2566 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902441 2566 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902444 2566 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902447 2566 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902450 2566 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902453 2566 flags.go:64] FLAG: --system-cgroups=""
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902456 2566 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902462 2566 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902464 2566 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902468 2566 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902473 2566 flags.go:64] FLAG: --tls-min-version=""
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902475 2566 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902478 2566 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902481 2566 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902484 2566 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902487 2566 flags.go:64] FLAG: --v="2"
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902491 2566 flags.go:64] FLAG: --version="false"
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902495 2566 flags.go:64] FLAG: --vmodule=""
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902506 2566 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 21:09:02.907107 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.902509 2566 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 21:09:02.907700 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902608 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 21:09:02.907700 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902613 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 21:09:02.907700 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902617 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 21:09:02.907700 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902620 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 21:09:02.907700 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902623 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 21:09:02.907700 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902626 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 21:09:02.907700 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902628 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 21:09:02.907700 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902631 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 21:09:02.907700 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902633 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 21:09:02.907700 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902636 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 21:09:02.907700 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902639 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 21:09:02.907700 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902642 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 21:09:02.907700 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902644 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 21:09:02.907700 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902647 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 21:09:02.907700 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902649 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 21:09:02.907700 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902652 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 21:09:02.907700 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902654 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 21:09:02.907700 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902657 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 21:09:02.907700 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902659 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 21:09:02.908182 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902662 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 21:09:02.908182 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902664 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 21:09:02.908182 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902667 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 21:09:02.908182 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902669 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 21:09:02.908182 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902672 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 21:09:02.908182 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902674 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 21:09:02.908182 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902677 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 21:09:02.908182 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902679 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 21:09:02.908182 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902682 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 21:09:02.908182 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902684 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 21:09:02.908182 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902686 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 21:09:02.908182 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902689 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 21:09:02.908182 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902692 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 21:09:02.908182 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902694 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 21:09:02.908182 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902697 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 21:09:02.908182 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902700 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 21:09:02.908182 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902707 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 21:09:02.908182 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902710 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 21:09:02.908182 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902712 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 21:09:02.908182 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902715 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 21:09:02.908705 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902717 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 21:09:02.908705 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902722 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 21:09:02.908705 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902724 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 21:09:02.908705 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902727 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 21:09:02.908705 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902729 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 21:09:02.908705 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902732 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 21:09:02.908705 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902734 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 21:09:02.908705 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902737 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 21:09:02.908705 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902739 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 21:09:02.908705 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902742 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 21:09:02.908705 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902744 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 21:09:02.908705 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902746 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 21:09:02.908705 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902749 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 21:09:02.908705 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902752 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 21:09:02.908705 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902754 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 21:09:02.908705 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902757 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 21:09:02.908705 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902759 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 21:09:02.908705 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902762 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 21:09:02.908705 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902764 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 21:09:02.908705 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902767 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 21:09:02.909197 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902769 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 21:09:02.909197 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902772 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 21:09:02.909197 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902774 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 21:09:02.909197 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902777 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 21:09:02.909197 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902779 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 21:09:02.909197 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902782 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 21:09:02.909197 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902784 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 21:09:02.909197 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902787 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 21:09:02.909197 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902791 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 21:09:02.909197 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902794 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 21:09:02.909197 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902797 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 21:09:02.909197 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902799 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 21:09:02.909197 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902802 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 21:09:02.909197 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902806 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 21:09:02.909197 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902808 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 21:09:02.909197 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902811 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 21:09:02.909197 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902814 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 21:09:02.909197 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902816 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 21:09:02.909197 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902821 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 21:09:02.909831 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902824 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 21:09:02.909831 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902827 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 21:09:02.909831 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902830 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 21:09:02.909831 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902832 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 21:09:02.909831 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902835 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 21:09:02.909831 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902837 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 21:09:02.909831 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902840 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 21:09:02.909831 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.902843 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 21:09:02.909831 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.903427 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 21:09:02.911800 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.911782 2566 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 21:09:02.911800 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.911800 2566 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 21:09:02.911870 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911850 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 21:09:02.911870 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911855 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 21:09:02.911870 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911859 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 21:09:02.911870 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911862 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 21:09:02.911870 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911865 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 21:09:02.911870 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911868 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 21:09:02.911870 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911872 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 21:09:02.912042 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911875 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 21:09:02.912042 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911878 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 21:09:02.912042 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911881 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 21:09:02.912042 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911883 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 21:09:02.912042 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911886 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 21:09:02.912042 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911889 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 21:09:02.912042 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911892 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 21:09:02.912042 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911894 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 21:09:02.912042 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911897 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 21:09:02.912042 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911901 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 21:09:02.912042 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911906 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 21:09:02.912042 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911909 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 21:09:02.912042 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911912 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 21:09:02.912042 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911916 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 21:09:02.912042 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911919 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 21:09:02.912042 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911922 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 21:09:02.912042 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911924 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 21:09:02.912042 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911927 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 21:09:02.912042 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911930 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 21:09:02.912534 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911932 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 21:09:02.912534 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911935 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 21:09:02.912534 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911938 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 21:09:02.912534 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911940 2566 
feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 21:09:02.912534 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911944 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 21:09:02.912534 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911948 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 21:09:02.912534 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911951 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 21:09:02.912534 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911954 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 21:09:02.912534 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911956 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 21:09:02.912534 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911959 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 21:09:02.912534 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911961 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 21:09:02.912534 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911964 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 21:09:02.912534 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911966 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 21:09:02.912534 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911969 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 21:09:02.912534 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911972 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 21:09:02.912534 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911975 2566 feature_gate.go:328] unrecognized feature gate: 
VSphereHostVMGroupZonal Apr 22 21:09:02.912534 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911978 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 21:09:02.912534 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911982 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 21:09:02.912534 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911985 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 21:09:02.913002 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911987 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 22 21:09:02.913002 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911990 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 21:09:02.913002 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911993 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 21:09:02.913002 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911995 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 21:09:02.913002 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.911998 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 21:09:02.913002 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912001 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 21:09:02.913002 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912003 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 21:09:02.913002 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912006 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 21:09:02.913002 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912009 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 21:09:02.913002 ip-10-0-134-137 
kubenswrapper[2566]: W0422 21:09:02.912011 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 21:09:02.913002 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912014 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 21:09:02.913002 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912017 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 21:09:02.913002 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912020 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 21:09:02.913002 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912023 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 21:09:02.913002 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912025 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 21:09:02.913002 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912028 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 21:09:02.913002 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912030 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 21:09:02.913002 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912033 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 21:09:02.913002 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912036 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 21:09:02.913002 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912039 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 21:09:02.913531 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912041 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 21:09:02.913531 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912044 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 
22 21:09:02.913531 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912047 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 21:09:02.913531 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912049 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 21:09:02.913531 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912052 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 21:09:02.913531 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912055 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 21:09:02.913531 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912058 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 21:09:02.913531 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912061 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 21:09:02.913531 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912064 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 21:09:02.913531 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912067 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 21:09:02.913531 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912069 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 21:09:02.913531 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912072 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 21:09:02.913531 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912075 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 21:09:02.913531 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912078 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 21:09:02.913531 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912080 2566 
feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 21:09:02.913531 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912083 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 21:09:02.913531 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912085 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 21:09:02.913531 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912088 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 21:09:02.913531 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912090 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 21:09:02.913531 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912093 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 21:09:02.914020 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912096 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 21:09:02.914020 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.912101 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 21:09:02.914020 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912198 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 21:09:02.914020 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912204 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 21:09:02.914020 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912208 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 21:09:02.914020 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912212 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 21:09:02.914020 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912214 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 21:09:02.914020 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912217 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 21:09:02.914020 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912220 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 21:09:02.914020 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912223 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 21:09:02.914020 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912225 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 21:09:02.914020 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912228 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 21:09:02.914020 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912230 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 22 21:09:02.914020 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912233 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 21:09:02.914020 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912235 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 21:09:02.914396 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912238 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 21:09:02.914396 ip-10-0-134-137 kubenswrapper[2566]: W0422 
21:09:02.912241 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 21:09:02.914396 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912244 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 21:09:02.914396 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912246 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 21:09:02.914396 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912249 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 21:09:02.914396 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912252 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 21:09:02.914396 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912255 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 21:09:02.914396 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912258 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 21:09:02.914396 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912261 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 21:09:02.914396 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912264 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 21:09:02.914396 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912267 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 21:09:02.914396 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912269 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 21:09:02.914396 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912272 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 21:09:02.914396 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912274 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 
21:09:02.914396 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912277 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 21:09:02.914396 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912280 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 21:09:02.914396 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912282 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 21:09:02.914396 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912285 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 21:09:02.914396 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912288 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 21:09:02.914396 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912290 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 21:09:02.914897 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912293 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 21:09:02.914897 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912295 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 21:09:02.914897 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912298 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 21:09:02.914897 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912300 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 21:09:02.914897 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912303 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 21:09:02.914897 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912305 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 21:09:02.914897 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912308 2566 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesAzure Apr 22 21:09:02.914897 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912310 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 21:09:02.914897 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912313 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 21:09:02.914897 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912315 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 21:09:02.914897 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912318 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 21:09:02.914897 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912320 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 21:09:02.914897 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912323 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 21:09:02.914897 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912325 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 21:09:02.914897 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912328 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 21:09:02.914897 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912331 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 21:09:02.914897 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912334 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 21:09:02.914897 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912336 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 21:09:02.914897 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912339 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 
21:09:02.914897 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912341 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 21:09:02.915384 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912344 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 21:09:02.915384 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912346 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 21:09:02.915384 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912349 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 21:09:02.915384 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912352 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 21:09:02.915384 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912354 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 21:09:02.915384 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912358 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 21:09:02.915384 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912361 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 21:09:02.915384 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912365 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 21:09:02.915384 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912368 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 21:09:02.915384 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912370 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 21:09:02.915384 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912373 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 21:09:02.915384 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912376 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 21:09:02.915384 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912378 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 21:09:02.915384 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912381 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 21:09:02.915384 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912383 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 21:09:02.915384 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912386 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 21:09:02.915384 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912388 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 21:09:02.915384 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912391 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 21:09:02.915384 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912393 2566 feature_gate.go:328] unrecognized 
feature gate: InsightsOnDemandDataGather Apr 22 21:09:02.915384 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912396 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 21:09:02.915957 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912398 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 21:09:02.915957 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912401 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 21:09:02.915957 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912403 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 21:09:02.915957 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912420 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 21:09:02.915957 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912425 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 21:09:02.915957 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912428 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 21:09:02.915957 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912431 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 21:09:02.915957 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912434 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 21:09:02.915957 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912437 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 21:09:02.915957 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912440 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 21:09:02.915957 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912442 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 21:09:02.915957 ip-10-0-134-137 
kubenswrapper[2566]: W0422 21:09:02.912445 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 21:09:02.915957 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:02.912448 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 21:09:02.915957 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.912453 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 21:09:02.915957 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.913026 2566 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 21:09:02.916322 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.916296 2566 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 21:09:02.917069 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.917058 2566 server.go:1019] "Starting client certificate rotation" Apr 22 21:09:02.917174 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.917158 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 21:09:02.917208 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.917200 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 21:09:02.937778 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.937762 2566 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 21:09:02.940095 ip-10-0-134-137 
kubenswrapper[2566]: I0422 21:09:02.940064 2566 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 21:09:02.951714 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.951698 2566 log.go:25] "Validated CRI v1 runtime API" Apr 22 21:09:02.957267 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.957253 2566 log.go:25] "Validated CRI v1 image API" Apr 22 21:09:02.959338 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.959320 2566 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 21:09:02.961501 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.961482 2566 fs.go:135] Filesystem UUIDs: map[657645ba-2240-4742-b31c-257ce53724a9:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 f7f4da9b-1479-4b6f-aabc-520fe0269ce0:/dev/nvme0n1p4] Apr 22 21:09:02.961570 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.961500 2566 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 21:09:02.967109 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.967090 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 21:09:02.969537 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.969426 2566 manager.go:217] Machine: {Timestamp:2026-04-22 21:09:02.967453812 +0000 UTC m=+0.324268555 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3106956 MemoryCapacity:33164488704 SwapCapacity:0 
MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec209076ef3809fc1a69615c09bf7e22 SystemUUID:ec209076-ef38-09fc-1a69-615c09bf7e22 BootID:ec1650c4-a025-4910-9c9c-379c66589c3c Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:07:73:63:29:37 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:07:73:63:29:37 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:62:df:84:55:93:96 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 
Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 21:09:02.969537 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.969532 2566 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 22 21:09:02.969654 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.969615 2566 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 21:09:02.971102 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.971077 2566 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 21:09:02.971237 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.971104 2566 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-134-137.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 21:09:02.971283 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.971246 2566 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 21:09:02.971283 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.971255 2566 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 21:09:02.971283 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.971267 
2566 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 21:09:02.971995 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.971984 2566 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 21:09:02.972685 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.972675 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 22 21:09:02.972806 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.972797 2566 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 21:09:02.974664 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.974654 2566 kubelet.go:491] "Attempting to sync node with API server" Apr 22 21:09:02.974701 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.974668 2566 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 21:09:02.974701 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.974680 2566 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 21:09:02.974701 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.974691 2566 kubelet.go:397] "Adding apiserver pod source" Apr 22 21:09:02.974701 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.974699 2566 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 21:09:02.975625 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.975613 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 21:09:02.975683 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.975631 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 21:09:02.978073 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.978057 2566 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 21:09:02.979369 ip-10-0-134-137 
kubenswrapper[2566]: I0422 21:09:02.979356 2566 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 21:09:02.980525 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.980512 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 21:09:02.980567 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.980535 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 21:09:02.980567 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.980544 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 21:09:02.980567 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.980552 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 21:09:02.980567 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.980558 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 21:09:02.980567 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.980563 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 21:09:02.980567 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.980569 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 21:09:02.980725 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.980574 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 21:09:02.980725 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.980582 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 21:09:02.980725 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.980588 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 21:09:02.980725 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.980603 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 
21:09:02.980725 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.980613 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 21:09:02.981316 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.981307 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 21:09:02.981348 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.981317 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 21:09:02.984937 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.984924 2566 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 21:09:02.984987 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.984957 2566 server.go:1295] "Started kubelet" Apr 22 21:09:02.985064 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.985035 2566 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 21:09:02.986015 ip-10-0-134-137 systemd[1]: Started Kubernetes Kubelet. Apr 22 21:09:02.986134 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.985962 2566 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 21:09:02.986772 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.986630 2566 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 21:09:02.987388 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.987375 2566 server.go:317] "Adding debug handlers to kubelet server" Apr 22 21:09:02.987855 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.987820 2566 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 21:09:02.988886 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:02.988860 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 21:09:02.989003 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:02.988857 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-137.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 21:09:02.989003 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.988912 2566 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-137.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 21:09:02.992734 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.992716 2566 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 21:09:02.992734 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.992728 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 21:09:02.993236 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:02.993212 2566 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 21:09:02.993588 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:02.993554 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-137.ec2.internal\" not found" Apr 22 21:09:02.993663 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.993623 2566 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 21:09:02.993786 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.993757 2566 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 21:09:02.993786 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.993774 2566 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 21:09:02.994066 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.993879 2566 reconstruct.go:97] "Volume reconstruction finished" Apr 22 21:09:02.994066 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.994060 2566 factory.go:153] Registering CRI-O factory Apr 22 21:09:02.994066 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.994066 2566 reconciler.go:26] "Reconciler: start to sync state" Apr 22 21:09:02.994214 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.994113 2566 factory.go:223] Registration of the crio container factory successfully Apr 22 21:09:02.996111 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.996091 2566 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 21:09:02.996198 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.996112 2566 factory.go:55] Registering systemd factory Apr 22 21:09:02.996198 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.996124 2566 factory.go:223] Registration of the systemd container factory successfully Apr 22 
21:09:02.996198 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.996178 2566 factory.go:103] Registering Raw factory Apr 22 21:09:02.996198 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.996193 2566 manager.go:1196] Started watching for new ooms in manager Apr 22 21:09:02.996378 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:02.993937 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-137.ec2.internal.18a8c9fd5accbbfa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-137.ec2.internal,UID:ip-10-0-134-137.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-137.ec2.internal,},FirstTimestamp:2026-04-22 21:09:02.984936442 +0000 UTC m=+0.341751185,LastTimestamp:2026-04-22 21:09:02.984936442 +0000 UTC m=+0.341751185,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-137.ec2.internal,}" Apr 22 21:09:02.996691 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:02.996678 2566 manager.go:319] Starting recovery of all containers Apr 22 21:09:03.000250 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:03.000209 2566 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-137.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 21:09:03.000331 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:03.000276 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User 
\"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 21:09:03.003138 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.003098 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 21:09:03.006579 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.006547 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6knvj" Apr 22 21:09:03.008876 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.008863 2566 manager.go:324] Recovery completed Apr 22 21:09:03.012785 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.012774 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 21:09:03.013519 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.013503 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6knvj" Apr 22 21:09:03.015368 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.015355 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-137.ec2.internal" event="NodeHasSufficientMemory" Apr 22 21:09:03.015434 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.015380 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-137.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 21:09:03.015434 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.015390 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-137.ec2.internal" event="NodeHasSufficientPID" Apr 22 21:09:03.015825 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.015812 2566 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 21:09:03.015865 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.015824 2566 
cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 21:09:03.015865 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.015842 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 22 21:09:03.017328 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:03.017264 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-137.ec2.internal.18a8c9fd5c9d16ff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-137.ec2.internal,UID:ip-10-0-134-137.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-134-137.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-134-137.ec2.internal,},FirstTimestamp:2026-04-22 21:09:03.015368447 +0000 UTC m=+0.372183190,LastTimestamp:2026-04-22 21:09:03.015368447 +0000 UTC m=+0.372183190,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-137.ec2.internal,}" Apr 22 21:09:03.017928 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.017915 2566 policy_none.go:49] "None policy: Start" Apr 22 21:09:03.017979 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.017932 2566 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 21:09:03.017979 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.017942 2566 state_mem.go:35] "Initializing new in-memory state store" Apr 22 21:09:03.054183 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.054168 2566 manager.go:341] "Starting Device Plugin manager" Apr 22 21:09:03.065818 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:03.054206 2566 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" 
checkpoint="kubelet_internal_checkpoint" Apr 22 21:09:03.065818 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.054216 2566 server.go:85] "Starting device plugin registration server" Apr 22 21:09:03.065818 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.054488 2566 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 21:09:03.065818 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.054498 2566 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 21:09:03.065818 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.054615 2566 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 21:09:03.065818 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.054680 2566 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 21:09:03.065818 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.054688 2566 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 21:09:03.065818 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:03.055272 2566 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 21:09:03.065818 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:03.055314 2566 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-137.ec2.internal\" not found" Apr 22 21:09:03.127746 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.127714 2566 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 22 21:09:03.127746 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.127746 2566 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 21:09:03.127887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.127762 2566 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 21:09:03.127887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.127769 2566 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 21:09:03.127887 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:03.127800 2566 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 21:09:03.130011 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.129996 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 21:09:03.154859 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.154816 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 21:09:03.155564 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.155551 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-137.ec2.internal" event="NodeHasSufficientMemory" Apr 22 21:09:03.155606 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.155578 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-137.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 21:09:03.155606 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.155589 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-137.ec2.internal" event="NodeHasSufficientPID" Apr 22 21:09:03.155671 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.155610 2566 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-137.ec2.internal" Apr 22 21:09:03.163132 ip-10-0-134-137 kubenswrapper[2566]: I0422 
21:09:03.163118 2566 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-137.ec2.internal" Apr 22 21:09:03.163175 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:03.163138 2566 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-137.ec2.internal\": node \"ip-10-0-134-137.ec2.internal\" not found" Apr 22 21:09:03.178953 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:03.178932 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-137.ec2.internal\" not found" Apr 22 21:09:03.228063 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.228038 2566 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-137.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-137.ec2.internal"] Apr 22 21:09:03.228142 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.228108 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 21:09:03.229027 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.229005 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-137.ec2.internal" event="NodeHasSufficientMemory" Apr 22 21:09:03.229129 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.229034 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-137.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 21:09:03.229129 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.229043 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-137.ec2.internal" event="NodeHasSufficientPID" Apr 22 21:09:03.230090 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.230077 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 21:09:03.230243 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.230227 2566 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-137.ec2.internal" Apr 22 21:09:03.230302 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.230263 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 21:09:03.230823 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.230807 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-137.ec2.internal" event="NodeHasSufficientMemory" Apr 22 21:09:03.230892 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.230835 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-137.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 21:09:03.230892 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.230848 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-137.ec2.internal" event="NodeHasSufficientPID" Apr 22 21:09:03.230892 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.230886 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-137.ec2.internal" event="NodeHasSufficientMemory" Apr 22 21:09:03.230992 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.230908 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-137.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 21:09:03.230992 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.230921 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-137.ec2.internal" event="NodeHasSufficientPID" Apr 22 21:09:03.231804 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.231792 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-137.ec2.internal" Apr 22 21:09:03.231845 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.231821 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 21:09:03.232458 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.232442 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-137.ec2.internal" event="NodeHasSufficientMemory" Apr 22 21:09:03.232551 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.232466 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-137.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 21:09:03.232551 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.232476 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-137.ec2.internal" event="NodeHasSufficientPID" Apr 22 21:09:03.256953 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:03.256935 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-137.ec2.internal\" not found" node="ip-10-0-134-137.ec2.internal" Apr 22 21:09:03.261261 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:03.261246 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-137.ec2.internal\" not found" node="ip-10-0-134-137.ec2.internal" Apr 22 21:09:03.279641 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:03.279625 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-137.ec2.internal\" not found" Apr 22 21:09:03.296737 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.296719 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a5c58d625b2bd1b4be43292bd5f1c38-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-134-137.ec2.internal\" (UID: \"8a5c58d625b2bd1b4be43292bd5f1c38\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-137.ec2.internal" Apr 22 21:09:03.296794 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.296745 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c6c453189c3f15e838fbba5937c30555-config\") pod \"kube-apiserver-proxy-ip-10-0-134-137.ec2.internal\" (UID: \"c6c453189c3f15e838fbba5937c30555\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-137.ec2.internal" Apr 22 21:09:03.296794 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.296762 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8a5c58d625b2bd1b4be43292bd5f1c38-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-137.ec2.internal\" (UID: \"8a5c58d625b2bd1b4be43292bd5f1c38\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-137.ec2.internal" Apr 22 21:09:03.380650 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:03.380621 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-137.ec2.internal\" not found" Apr 22 21:09:03.396896 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.396879 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c6c453189c3f15e838fbba5937c30555-config\") pod \"kube-apiserver-proxy-ip-10-0-134-137.ec2.internal\" (UID: \"c6c453189c3f15e838fbba5937c30555\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-137.ec2.internal" Apr 22 21:09:03.396950 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.396901 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/8a5c58d625b2bd1b4be43292bd5f1c38-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-137.ec2.internal\" (UID: \"8a5c58d625b2bd1b4be43292bd5f1c38\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-137.ec2.internal" Apr 22 21:09:03.396950 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.396919 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a5c58d625b2bd1b4be43292bd5f1c38-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-137.ec2.internal\" (UID: \"8a5c58d625b2bd1b4be43292bd5f1c38\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-137.ec2.internal" Apr 22 21:09:03.396950 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.396944 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a5c58d625b2bd1b4be43292bd5f1c38-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-137.ec2.internal\" (UID: \"8a5c58d625b2bd1b4be43292bd5f1c38\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-137.ec2.internal" Apr 22 21:09:03.397041 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.396946 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/8a5c58d625b2bd1b4be43292bd5f1c38-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-137.ec2.internal\" (UID: \"8a5c58d625b2bd1b4be43292bd5f1c38\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-137.ec2.internal" Apr 22 21:09:03.397041 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.396967 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c6c453189c3f15e838fbba5937c30555-config\") pod \"kube-apiserver-proxy-ip-10-0-134-137.ec2.internal\" (UID: \"c6c453189c3f15e838fbba5937c30555\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-134-137.ec2.internal"
Apr 22 21:09:03.481316 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:03.481261 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-137.ec2.internal\" not found"
Apr 22 21:09:03.559852 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.559825 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-137.ec2.internal"
Apr 22 21:09:03.563485 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.563467 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-137.ec2.internal"
Apr 22 21:09:03.582308 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:03.582281 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-137.ec2.internal\" not found"
Apr 22 21:09:03.682893 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:03.682857 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-137.ec2.internal\" not found"
Apr 22 21:09:03.783474 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:03.783380 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-137.ec2.internal\" not found"
Apr 22 21:09:03.823512 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.823491 2566 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 21:09:03.893315 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.893290 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-137.ec2.internal"
Apr 22 21:09:03.903154 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.903136 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 21:09:03.904488 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.904474 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-137.ec2.internal"
Apr 22 21:09:03.914345 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.914328 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 21:09:03.917471 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.917453 2566 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 21:09:03.917568 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.917554 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 21:09:03.975721 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.975692 2566 apiserver.go:52] "Watching apiserver"
Apr 22 21:09:03.982211 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.982187 2566 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 21:09:03.983197 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.983175 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-fcxqm","kube-system/kube-apiserver-proxy-ip-10-0-134-137.ec2.internal","openshift-cluster-node-tuning-operator/tuned-v4hv7","openshift-dns/node-resolver-t4hx2","openshift-image-registry/node-ca-jq6nq","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-137.ec2.internal","openshift-multus/multus-additional-cni-plugins-w54p8","openshift-multus/network-metrics-daemon-qvqz5","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6","openshift-multus/multus-f9bbq","openshift-network-diagnostics/network-check-target-b4vrr","openshift-network-operator/iptables-alerter-q2xw2","openshift-ovn-kubernetes/ovnkube-node-wqvs8"]
Apr 22 21:09:03.986123 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.986098 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fcxqm"
Apr 22 21:09:03.987360 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.987344 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:03.987896 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.987878 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 21:09:03.987896 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.987886 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 21:09:03.988071 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.988054 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8dtqk\""
Apr 22 21:09:03.988349 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.988335 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-t4hx2"
Apr 22 21:09:03.988469 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.988452 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-jq6nq"
Apr 22 21:09:03.988942 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.988925 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 21:09:03.989003 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.988971 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 21:09:03.989041 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.989003 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-blwbc\""
Apr 22 21:09:03.989795 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.989775 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-w54p8"
Apr 22 21:09:03.989973 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.989872 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qrsr9\""
Apr 22 21:09:03.990207 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.990185 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 21:09:03.990296 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.990195 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 21:09:03.990438 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.990423 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 21:09:03.990507 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.990430 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-xjz2h\""
Apr 22 21:09:03.990507 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.990430 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 21:09:03.990623 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.990558 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 21:09:03.990893 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.990878 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:03.991000 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:03.990979 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvqz5" podUID="01c1be80-c7fc-433f-bf11-a97af5540866"
Apr 22 21:09:03.991243 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.991228 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wtxv4\""
Apr 22 21:09:03.991354 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.991342 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 21:09:03.991519 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.991508 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 21:09:03.991742 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.991726 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 21:09:03.991800 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.991755 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 21:09:03.991848 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.991813 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 21:09:03.992171 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.992157 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6"
Apr 22 21:09:03.992802 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.992787 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 21:09:03.993244 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.993229 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:03.993640 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.993598 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 21:09:03.993743 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.993663 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 21:09:03.993743 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.993716 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 21:09:03.993896 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.993880 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-k884x\""
Apr 22 21:09:03.995047 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.995027 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 21:09:03.995266 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.995249 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-29lzd\""
Apr 22 21:09:03.997604 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.997582 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4vrr"
Apr 22 21:09:03.997965 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:03.997945 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4vrr" podUID="f1c2f55b-0450-43f7-b50e-57b4c3d15108"
Apr 22 21:09:03.999135 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:03.999115 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-q2xw2"
Apr 22 21:09:04.000169 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.000152 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:04.000627 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.000592 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-host-var-lib-kubelet\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.000732 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.000637 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5eca370a-4734-46fc-810c-17827b9aa727-tmp\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.000732 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.000654 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgxf8\" (UniqueName: \"kubernetes.io/projected/83539234-7602-4c9d-a9c8-05dca158b65b-kube-api-access-tgxf8\") pod \"node-resolver-t4hx2\" (UID: \"83539234-7602-4c9d-a9c8-05dca158b65b\") " pod="openshift-dns/node-resolver-t4hx2"
Apr 22 21:09:04.000732 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.000703 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d86eb1d3-4e06-4952-905d-3bc13ae4849b-system-cni-dir\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8"
Apr 22 21:09:04.000887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.000744 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 21:09:04.000887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.000741 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fbd1aca7-dece-43f0-b914-b6993f56f39a-serviceca\") pod \"node-ca-jq6nq\" (UID: \"fbd1aca7-dece-43f0-b914-b6993f56f39a\") " pod="openshift-image-registry/node-ca-jq6nq"
Apr 22 21:09:04.000887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.000791 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbgxm\" (UniqueName: \"kubernetes.io/projected/01c1be80-c7fc-433f-bf11-a97af5540866-kube-api-access-cbgxm\") pod \"network-metrics-daemon-qvqz5\" (UID: \"01c1be80-c7fc-433f-bf11-a97af5540866\") " pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:04.000887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.000820 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvv8h\" (UniqueName: \"kubernetes.io/projected/ba523d03-d12e-4172-89eb-9885c3215d06-kube-api-access-gvv8h\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.000887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.000831 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 21:09:04.000887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.000844 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-multus-conf-dir\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.000887 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.000871 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-etc-sysconfig\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.001162 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.000900 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d86eb1d3-4e06-4952-905d-3bc13ae4849b-os-release\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8"
Apr 22 21:09:04.001162 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.000908 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 21:09:04.001162 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.000918 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8c9bd2f4-6d8d-4016-b3c4-d2b398b5abc9-konnectivity-ca\") pod \"konnectivity-agent-fcxqm\" (UID: \"8c9bd2f4-6d8d-4016-b3c4-d2b398b5abc9\") " pod="kube-system/konnectivity-agent-fcxqm"
Apr 22 21:09:04.001162 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.000908 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lcfnh\""
Apr 22 21:09:04.001162 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.000957 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-etc-modprobe-d\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.001162 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001006 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5eca370a-4734-46fc-810c-17827b9aa727-etc-tuned\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.001162 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001026 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d86eb1d3-4e06-4952-905d-3bc13ae4849b-cnibin\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8"
Apr 22 21:09:04.001162 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001043 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/be7fc1c2-37d7-43b8-9b3e-949d5ff50946-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mq7k6\" (UID: \"be7fc1c2-37d7-43b8-9b3e-949d5ff50946\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6"
Apr 22 21:09:04.001162 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001072 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/be7fc1c2-37d7-43b8-9b3e-949d5ff50946-registration-dir\") pod \"aws-ebs-csi-driver-node-mq7k6\" (UID: \"be7fc1c2-37d7-43b8-9b3e-949d5ff50946\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6"
Apr 22 21:09:04.001162 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001087 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/be7fc1c2-37d7-43b8-9b3e-949d5ff50946-device-dir\") pod \"aws-ebs-csi-driver-node-mq7k6\" (UID: \"be7fc1c2-37d7-43b8-9b3e-949d5ff50946\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6"
Apr 22 21:09:04.001162 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001101 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-run\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.001162 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001119 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-host-var-lib-cni-bin\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.001162 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001132 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-host-run-multus-certs\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.001162 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001157 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d86eb1d3-4e06-4952-905d-3bc13ae4849b-cni-binary-copy\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8"
Apr 22 21:09:04.001162 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001177 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-867xc\" (UniqueName: \"kubernetes.io/projected/fbd1aca7-dece-43f0-b914-b6993f56f39a-kube-api-access-867xc\") pod \"node-ca-jq6nq\" (UID: \"fbd1aca7-dece-43f0-b914-b6993f56f39a\") " pod="openshift-image-registry/node-ca-jq6nq"
Apr 22 21:09:04.001750 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001192 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-etc-kubernetes\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.001750 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001213 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8c9bd2f4-6d8d-4016-b3c4-d2b398b5abc9-agent-certs\") pod \"konnectivity-agent-fcxqm\" (UID: \"8c9bd2f4-6d8d-4016-b3c4-d2b398b5abc9\") " pod="kube-system/konnectivity-agent-fcxqm"
Apr 22 21:09:04.001750 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001228 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/be7fc1c2-37d7-43b8-9b3e-949d5ff50946-sys-fs\") pod \"aws-ebs-csi-driver-node-mq7k6\" (UID: \"be7fc1c2-37d7-43b8-9b3e-949d5ff50946\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6"
Apr 22 21:09:04.001750 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001251 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba523d03-d12e-4172-89eb-9885c3215d06-cni-binary-copy\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.001750 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001276 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-multus-socket-dir-parent\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.001750 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001298 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-hostroot\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.001750 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001314 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-etc-systemd\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.001750 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001328 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d86eb1d3-4e06-4952-905d-3bc13ae4849b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8"
Apr 22 21:09:04.001750 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001347 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-multus-cni-dir\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.001750 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001369 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ba523d03-d12e-4172-89eb-9885c3215d06-multus-daemon-config\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.001750 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001383 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-lib-modules\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.001750 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001397 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nqts\" (UniqueName: \"kubernetes.io/projected/5eca370a-4734-46fc-810c-17827b9aa727-kube-api-access-4nqts\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.001750 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001451 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/83539234-7602-4c9d-a9c8-05dca158b65b-hosts-file\") pod \"node-resolver-t4hx2\" (UID: \"83539234-7602-4c9d-a9c8-05dca158b65b\") " pod="openshift-dns/node-resolver-t4hx2"
Apr 22 21:09:04.001750 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001477 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d86eb1d3-4e06-4952-905d-3bc13ae4849b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8"
Apr 22 21:09:04.001750 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001497 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkg9b\" (UniqueName: \"kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b\") pod \"network-check-target-b4vrr\" (UID: \"f1c2f55b-0450-43f7-b50e-57b4c3d15108\") " pod="openshift-network-diagnostics/network-check-target-b4vrr"
Apr 22 21:09:04.001750 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001511 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-host-run-k8s-cni-cncf-io\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.002535 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001532 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-host-var-lib-cni-multus\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.002535 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001552 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-etc-sysctl-conf\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.002535 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001581 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlcpl\" (UniqueName: \"kubernetes.io/projected/d86eb1d3-4e06-4952-905d-3bc13ae4849b-kube-api-access-zlcpl\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8"
Apr 22 21:09:04.002535 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001603 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-host-run-netns\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.002535 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001619 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-host\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.002535 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001640 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-os-release\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.002535 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001654 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-etc-kubernetes\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.002535 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001668 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83539234-7602-4c9d-a9c8-05dca158b65b-tmp-dir\") pod \"node-resolver-t4hx2\" (UID: \"83539234-7602-4c9d-a9c8-05dca158b65b\") " pod="openshift-dns/node-resolver-t4hx2"
Apr 22 21:09:04.002535 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001682 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs\") pod \"network-metrics-daemon-qvqz5\" (UID: \"01c1be80-c7fc-433f-bf11-a97af5540866\") " pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:04.002535 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001703 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fbd1aca7-dece-43f0-b914-b6993f56f39a-host\") pod \"node-ca-jq6nq\" (UID: \"fbd1aca7-dece-43f0-b914-b6993f56f39a\") " pod="openshift-image-registry/node-ca-jq6nq"
Apr 22 21:09:04.002535 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001724 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/be7fc1c2-37d7-43b8-9b3e-949d5ff50946-socket-dir\") pod \"aws-ebs-csi-driver-node-mq7k6\" (UID: \"be7fc1c2-37d7-43b8-9b3e-949d5ff50946\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6"
Apr 22 21:09:04.002535 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001739 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/be7fc1c2-37d7-43b8-9b3e-949d5ff50946-etc-selinux\") pod \"aws-ebs-csi-driver-node-mq7k6\" (UID: \"be7fc1c2-37d7-43b8-9b3e-949d5ff50946\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6"
Apr 22 21:09:04.002535 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001751 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-system-cni-dir\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.002535 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001765 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-sys\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.002535 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001779 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-var-lib-kubelet\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.002535 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001791 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d86eb1d3-4e06-4952-905d-3bc13ae4849b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8"
Apr 22 21:09:04.003384 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001806 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg5rc\" (UniqueName: \"kubernetes.io/projected/be7fc1c2-37d7-43b8-9b3e-949d5ff50946-kube-api-access-mg5rc\") pod \"aws-ebs-csi-driver-node-mq7k6\" (UID: \"be7fc1c2-37d7-43b8-9b3e-949d5ff50946\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6"
Apr 22 21:09:04.003384 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001819 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-cnibin\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.003384 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.001855 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName:
\"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-etc-sysctl-d\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7" Apr 22 21:09:04.003384 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.002049 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 21:09:04.003384 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.002061 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 21:09:04.003384 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.002086 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 21:09:04.003384 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.002208 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 21:09:04.003384 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.002219 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 21:09:04.003384 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.002434 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-hnktj\"" Apr 22 21:09:04.003384 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.002463 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 21:09:04.006562 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.006544 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 
21:09:04.015005 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.014966 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 21:04:03 +0000 UTC" deadline="2028-02-04 15:32:50.627093716 +0000 UTC" Apr 22 21:09:04.015005 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.015004 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15666h23m46.612094506s" Apr 22 21:09:04.018965 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.018945 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 21:09:04.023024 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.023006 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-69t9r" Apr 22 21:09:04.030286 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.030265 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-69t9r" Apr 22 21:09:04.094112 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.094092 2566 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 21:09:04.102094 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102067 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-host-run-k8s-cni-cncf-io\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq" Apr 22 21:09:04.102209 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102107 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-host-var-lib-cni-multus\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq" Apr 22 21:09:04.102209 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102133 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-etc-sysctl-conf\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7" Apr 22 21:09:04.102209 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102150 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-host-run-k8s-cni-cncf-io\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq" Apr 22 21:09:04.102209 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102157 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlcpl\" (UniqueName: \"kubernetes.io/projected/d86eb1d3-4e06-4952-905d-3bc13ae4849b-kube-api-access-zlcpl\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8" Apr 22 21:09:04.102544 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102216 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-host-cni-bin\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.102544 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102246 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-host-run-netns\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq" Apr 22 21:09:04.102544 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102273 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-host\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7" Apr 22 21:09:04.102544 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102290 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-host-var-lib-cni-multus\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq" Apr 22 21:09:04.102544 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102301 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-etc-openvswitch\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.102544 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102326 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-run-openvswitch\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.102544 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102350 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-os-release\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq" Apr 22 21:09:04.102544 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102377 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-etc-kubernetes\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7" Apr 22 21:09:04.102544 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102401 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83539234-7602-4c9d-a9c8-05dca158b65b-tmp-dir\") pod \"node-resolver-t4hx2\" (UID: \"83539234-7602-4c9d-a9c8-05dca158b65b\") " pod="openshift-dns/node-resolver-t4hx2" Apr 22 21:09:04.102544 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102468 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs\") pod \"network-metrics-daemon-qvqz5\" (UID: \"01c1be80-c7fc-433f-bf11-a97af5540866\") " pod="openshift-multus/network-metrics-daemon-qvqz5" Apr 22 21:09:04.102544 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102497 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fbd1aca7-dece-43f0-b914-b6993f56f39a-host\") pod \"node-ca-jq6nq\" (UID: \"fbd1aca7-dece-43f0-b914-b6993f56f39a\") " pod="openshift-image-registry/node-ca-jq6nq" Apr 22 21:09:04.102544 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102504 2566 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-etc-sysctl-conf\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7" Apr 22 21:09:04.102544 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102526 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-var-lib-openvswitch\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.103187 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102558 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-host-run-ovn-kubernetes\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.103187 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102617 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fbd1aca7-dece-43f0-b914-b6993f56f39a-host\") pod \"node-ca-jq6nq\" (UID: \"fbd1aca7-dece-43f0-b914-b6993f56f39a\") " pod="openshift-image-registry/node-ca-jq6nq" Apr 22 21:09:04.103187 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102627 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-os-release\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq" Apr 22 21:09:04.103187 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102672 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmkn6\" (UniqueName: \"kubernetes.io/projected/aee77461-dec9-4db2-99dc-b345c2c300bb-kube-api-access-zmkn6\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.103187 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102682 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-etc-kubernetes\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7" Apr 22 21:09:04.103187 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:04.102700 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:04.103187 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102722 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/be7fc1c2-37d7-43b8-9b3e-949d5ff50946-socket-dir\") pod \"aws-ebs-csi-driver-node-mq7k6\" (UID: \"be7fc1c2-37d7-43b8-9b3e-949d5ff50946\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6" Apr 22 21:09:04.103187 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102730 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-host\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7" Apr 22 21:09:04.103187 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102762 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-host-run-netns\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq" Apr 22 21:09:04.103187 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:04.102800 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs podName:01c1be80-c7fc-433f-bf11-a97af5540866 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:04.602747588 +0000 UTC m=+1.959562334 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs") pod "network-metrics-daemon-qvqz5" (UID: "01c1be80-c7fc-433f-bf11-a97af5540866") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:04.103187 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102818 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/be7fc1c2-37d7-43b8-9b3e-949d5ff50946-etc-selinux\") pod \"aws-ebs-csi-driver-node-mq7k6\" (UID: \"be7fc1c2-37d7-43b8-9b3e-949d5ff50946\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6" Apr 22 21:09:04.103187 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102845 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-system-cni-dir\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq" Apr 22 21:09:04.103187 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102856 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/be7fc1c2-37d7-43b8-9b3e-949d5ff50946-socket-dir\") pod \"aws-ebs-csi-driver-node-mq7k6\" (UID: 
\"be7fc1c2-37d7-43b8-9b3e-949d5ff50946\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6" Apr 22 21:09:04.103187 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102871 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-sys\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7" Apr 22 21:09:04.103187 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102897 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-node-log\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.103187 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102914 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83539234-7602-4c9d-a9c8-05dca158b65b-tmp-dir\") pod \"node-resolver-t4hx2\" (UID: \"83539234-7602-4c9d-a9c8-05dca158b65b\") " pod="openshift-dns/node-resolver-t4hx2" Apr 22 21:09:04.103187 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102923 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e994ba74-d681-4003-b112-a3851276ee9b-iptables-alerter-script\") pod \"iptables-alerter-q2xw2\" (UID: \"e994ba74-d681-4003-b112-a3851276ee9b\") " pod="openshift-network-operator/iptables-alerter-q2xw2" Apr 22 21:09:04.104034 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102949 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-var-lib-kubelet\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7" Apr 22 21:09:04.104034 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102921 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/be7fc1c2-37d7-43b8-9b3e-949d5ff50946-etc-selinux\") pod \"aws-ebs-csi-driver-node-mq7k6\" (UID: \"be7fc1c2-37d7-43b8-9b3e-949d5ff50946\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6" Apr 22 21:09:04.104034 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102971 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-system-cni-dir\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq" Apr 22 21:09:04.104034 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.102978 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/aee77461-dec9-4db2-99dc-b345c2c300bb-ovn-node-metrics-cert\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.104034 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103017 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d86eb1d3-4e06-4952-905d-3bc13ae4849b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8" Apr 22 21:09:04.104034 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103027 2566 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-sys\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7" Apr 22 21:09:04.104034 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103022 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-var-lib-kubelet\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7" Apr 22 21:09:04.104034 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103083 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.104034 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103124 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mg5rc\" (UniqueName: \"kubernetes.io/projected/be7fc1c2-37d7-43b8-9b3e-949d5ff50946-kube-api-access-mg5rc\") pod \"aws-ebs-csi-driver-node-mq7k6\" (UID: \"be7fc1c2-37d7-43b8-9b3e-949d5ff50946\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6" Apr 22 21:09:04.104034 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103149 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-cnibin\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq" Apr 22 21:09:04.104034 ip-10-0-134-137 kubenswrapper[2566]: 
I0422 21:09:04.103173 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-etc-sysctl-d\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7" Apr 22 21:09:04.104034 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103199 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/aee77461-dec9-4db2-99dc-b345c2c300bb-ovnkube-config\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.104034 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103225 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-host-var-lib-kubelet\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq" Apr 22 21:09:04.104034 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103227 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-cnibin\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq" Apr 22 21:09:04.104034 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103260 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5eca370a-4734-46fc-810c-17827b9aa727-tmp\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7" Apr 22 21:09:04.104034 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103288 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgxf8\" (UniqueName: \"kubernetes.io/projected/83539234-7602-4c9d-a9c8-05dca158b65b-kube-api-access-tgxf8\") pod \"node-resolver-t4hx2\" (UID: \"83539234-7602-4c9d-a9c8-05dca158b65b\") " pod="openshift-dns/node-resolver-t4hx2" Apr 22 21:09:04.104034 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103309 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-host-var-lib-kubelet\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq" Apr 22 21:09:04.104810 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103314 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d86eb1d3-4e06-4952-905d-3bc13ae4849b-system-cni-dir\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8" Apr 22 21:09:04.104810 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103346 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d86eb1d3-4e06-4952-905d-3bc13ae4849b-system-cni-dir\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8" Apr 22 21:09:04.104810 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103342 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-etc-sysctl-d\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7" Apr 22 21:09:04.104810 ip-10-0-134-137 
kubenswrapper[2566]: I0422 21:09:04.103361 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fbd1aca7-dece-43f0-b914-b6993f56f39a-serviceca\") pod \"node-ca-jq6nq\" (UID: \"fbd1aca7-dece-43f0-b914-b6993f56f39a\") " pod="openshift-image-registry/node-ca-jq6nq"
Apr 22 21:09:04.104810 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103435 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-host-cni-netd\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:04.104810 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103458 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e994ba74-d681-4003-b112-a3851276ee9b-host-slash\") pod \"iptables-alerter-q2xw2\" (UID: \"e994ba74-d681-4003-b112-a3851276ee9b\") " pod="openshift-network-operator/iptables-alerter-q2xw2"
Apr 22 21:09:04.104810 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103500 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbgxm\" (UniqueName: \"kubernetes.io/projected/01c1be80-c7fc-433f-bf11-a97af5540866-kube-api-access-cbgxm\") pod \"network-metrics-daemon-qvqz5\" (UID: \"01c1be80-c7fc-433f-bf11-a97af5540866\") " pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:04.104810 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103527 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvv8h\" (UniqueName: \"kubernetes.io/projected/ba523d03-d12e-4172-89eb-9885c3215d06-kube-api-access-gvv8h\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.104810 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103553 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/aee77461-dec9-4db2-99dc-b345c2c300bb-env-overrides\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:04.104810 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103580 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-multus-conf-dir\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.104810 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103613 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d86eb1d3-4e06-4952-905d-3bc13ae4849b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8"
Apr 22 21:09:04.104810 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103603 2566 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 21:09:04.104810 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103627 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-multus-conf-dir\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.104810 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103709 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-etc-sysconfig\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.104810 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103743 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d86eb1d3-4e06-4952-905d-3bc13ae4849b-os-release\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8"
Apr 22 21:09:04.104810 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103797 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-etc-sysconfig\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.104810 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103804 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d86eb1d3-4e06-4952-905d-3bc13ae4849b-os-release\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8"
Apr 22 21:09:04.105654 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103831 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/aee77461-dec9-4db2-99dc-b345c2c300bb-ovnkube-script-lib\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:04.105654 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103865 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8c9bd2f4-6d8d-4016-b3c4-d2b398b5abc9-konnectivity-ca\") pod \"konnectivity-agent-fcxqm\" (UID: \"8c9bd2f4-6d8d-4016-b3c4-d2b398b5abc9\") " pod="kube-system/konnectivity-agent-fcxqm"
Apr 22 21:09:04.105654 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103879 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fbd1aca7-dece-43f0-b914-b6993f56f39a-serviceca\") pod \"node-ca-jq6nq\" (UID: \"fbd1aca7-dece-43f0-b914-b6993f56f39a\") " pod="openshift-image-registry/node-ca-jq6nq"
Apr 22 21:09:04.105654 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103902 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-etc-modprobe-d\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.105654 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103919 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5eca370a-4734-46fc-810c-17827b9aa727-etc-tuned\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.105654 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.103989 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d86eb1d3-4e06-4952-905d-3bc13ae4849b-cnibin\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8"
Apr 22 21:09:04.105654 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104018 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-systemd-units\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:04.105654 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104024 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-etc-modprobe-d\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.105654 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104041 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-host-slash\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:04.105654 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104070 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d86eb1d3-4e06-4952-905d-3bc13ae4849b-cnibin\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8"
Apr 22 21:09:04.105654 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104080 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-host-run-netns\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:04.105654 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104104 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-run-ovn\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:04.105654 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104137 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/be7fc1c2-37d7-43b8-9b3e-949d5ff50946-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mq7k6\" (UID: \"be7fc1c2-37d7-43b8-9b3e-949d5ff50946\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6"
Apr 22 21:09:04.105654 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104164 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/be7fc1c2-37d7-43b8-9b3e-949d5ff50946-registration-dir\") pod \"aws-ebs-csi-driver-node-mq7k6\" (UID: \"be7fc1c2-37d7-43b8-9b3e-949d5ff50946\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6"
Apr 22 21:09:04.105654 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104189 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/be7fc1c2-37d7-43b8-9b3e-949d5ff50946-device-dir\") pod \"aws-ebs-csi-driver-node-mq7k6\" (UID: \"be7fc1c2-37d7-43b8-9b3e-949d5ff50946\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6"
Apr 22 21:09:04.105654 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104252 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/be7fc1c2-37d7-43b8-9b3e-949d5ff50946-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mq7k6\" (UID: \"be7fc1c2-37d7-43b8-9b3e-949d5ff50946\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6"
Apr 22 21:09:04.105654 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104311 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/be7fc1c2-37d7-43b8-9b3e-949d5ff50946-device-dir\") pod \"aws-ebs-csi-driver-node-mq7k6\" (UID: \"be7fc1c2-37d7-43b8-9b3e-949d5ff50946\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6"
Apr 22 21:09:04.106536 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104345 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/8c9bd2f4-6d8d-4016-b3c4-d2b398b5abc9-konnectivity-ca\") pod \"konnectivity-agent-fcxqm\" (UID: \"8c9bd2f4-6d8d-4016-b3c4-d2b398b5abc9\") " pod="kube-system/konnectivity-agent-fcxqm"
Apr 22 21:09:04.106536 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104362 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/be7fc1c2-37d7-43b8-9b3e-949d5ff50946-registration-dir\") pod \"aws-ebs-csi-driver-node-mq7k6\" (UID: \"be7fc1c2-37d7-43b8-9b3e-949d5ff50946\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6"
Apr 22 21:09:04.106536 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104391 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-run\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.106536 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104444 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-run-systemd\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:04.106536 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104460 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-run\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.106536 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104473 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw2hr\" (UniqueName: \"kubernetes.io/projected/e994ba74-d681-4003-b112-a3851276ee9b-kube-api-access-hw2hr\") pod \"iptables-alerter-q2xw2\" (UID: \"e994ba74-d681-4003-b112-a3851276ee9b\") " pod="openshift-network-operator/iptables-alerter-q2xw2"
Apr 22 21:09:04.106536 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104511 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-host-var-lib-cni-bin\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.106536 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104539 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-host-run-multus-certs\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.106536 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104564 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d86eb1d3-4e06-4952-905d-3bc13ae4849b-cni-binary-copy\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8"
Apr 22 21:09:04.106536 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104588 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-host-var-lib-cni-bin\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.106536 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104607 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-host-run-multus-certs\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.106536 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104589 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-867xc\" (UniqueName: \"kubernetes.io/projected/fbd1aca7-dece-43f0-b914-b6993f56f39a-kube-api-access-867xc\") pod \"node-ca-jq6nq\" (UID: \"fbd1aca7-dece-43f0-b914-b6993f56f39a\") " pod="openshift-image-registry/node-ca-jq6nq"
Apr 22 21:09:04.106536 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104662 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-host-kubelet\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:04.106536 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104690 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-log-socket\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:04.106536 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104718 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-etc-kubernetes\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.106536 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104758 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8c9bd2f4-6d8d-4016-b3c4-d2b398b5abc9-agent-certs\") pod \"konnectivity-agent-fcxqm\" (UID: \"8c9bd2f4-6d8d-4016-b3c4-d2b398b5abc9\") " pod="kube-system/konnectivity-agent-fcxqm"
Apr 22 21:09:04.106536 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104768 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-etc-kubernetes\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.107164 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104804 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/be7fc1c2-37d7-43b8-9b3e-949d5ff50946-sys-fs\") pod \"aws-ebs-csi-driver-node-mq7k6\" (UID: \"be7fc1c2-37d7-43b8-9b3e-949d5ff50946\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6"
Apr 22 21:09:04.107164 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104830 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba523d03-d12e-4172-89eb-9885c3215d06-cni-binary-copy\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.107164 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104857 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-multus-socket-dir-parent\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.107164 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104882 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-hostroot\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.107164 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104904 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-etc-systemd\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.107164 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104926 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d86eb1d3-4e06-4952-905d-3bc13ae4849b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8"
Apr 22 21:09:04.107164 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104941 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-multus-socket-dir-parent\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.107164 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104833 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/be7fc1c2-37d7-43b8-9b3e-949d5ff50946-sys-fs\") pod \"aws-ebs-csi-driver-node-mq7k6\" (UID: \"be7fc1c2-37d7-43b8-9b3e-949d5ff50946\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6"
Apr 22 21:09:04.107164 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104957 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-multus-cni-dir\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.107164 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.105007 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-multus-cni-dir\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.107164 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.105052 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ba523d03-d12e-4172-89eb-9885c3215d06-hostroot\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.107164 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.104993 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ba523d03-d12e-4172-89eb-9885c3215d06-multus-daemon-config\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.107164 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.105055 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d86eb1d3-4e06-4952-905d-3bc13ae4849b-cni-binary-copy\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8"
Apr 22 21:09:04.107164 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.105149 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-lib-modules\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.107164 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.105178 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nqts\" (UniqueName: \"kubernetes.io/projected/5eca370a-4734-46fc-810c-17827b9aa727-kube-api-access-4nqts\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.107164 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.105191 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-etc-systemd\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.107164 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.105203 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/83539234-7602-4c9d-a9c8-05dca158b65b-hosts-file\") pod \"node-resolver-t4hx2\" (UID: \"83539234-7602-4c9d-a9c8-05dca158b65b\") " pod="openshift-dns/node-resolver-t4hx2"
Apr 22 21:09:04.107164 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.105228 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d86eb1d3-4e06-4952-905d-3bc13ae4849b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8"
Apr 22 21:09:04.107673 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.105262 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkg9b\" (UniqueName: \"kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b\") pod \"network-check-target-b4vrr\" (UID: \"f1c2f55b-0450-43f7-b50e-57b4c3d15108\") " pod="openshift-network-diagnostics/network-check-target-b4vrr"
Apr 22 21:09:04.107673 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.105302 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d86eb1d3-4e06-4952-905d-3bc13ae4849b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8"
Apr 22 21:09:04.107673 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.105538 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5eca370a-4734-46fc-810c-17827b9aa727-lib-modules\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.107673 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.105620 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/83539234-7602-4c9d-a9c8-05dca158b65b-hosts-file\") pod \"node-resolver-t4hx2\" (UID: \"83539234-7602-4c9d-a9c8-05dca158b65b\") " pod="openshift-dns/node-resolver-t4hx2"
Apr 22 21:09:04.107673 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.105925 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ba523d03-d12e-4172-89eb-9885c3215d06-multus-daemon-config\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.107673 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.105974 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d86eb1d3-4e06-4952-905d-3bc13ae4849b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8"
Apr 22 21:09:04.107673 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.106250 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ba523d03-d12e-4172-89eb-9885c3215d06-cni-binary-copy\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.107673 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.106537 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5eca370a-4734-46fc-810c-17827b9aa727-etc-tuned\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.107673 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.106583 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5eca370a-4734-46fc-810c-17827b9aa727-tmp\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.107673 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.107250 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/8c9bd2f4-6d8d-4016-b3c4-d2b398b5abc9-agent-certs\") pod \"konnectivity-agent-fcxqm\" (UID: \"8c9bd2f4-6d8d-4016-b3c4-d2b398b5abc9\") " pod="kube-system/konnectivity-agent-fcxqm"
Apr 22 21:09:04.114646 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.114113 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlcpl\" (UniqueName: \"kubernetes.io/projected/d86eb1d3-4e06-4952-905d-3bc13ae4849b-kube-api-access-zlcpl\") pod \"multus-additional-cni-plugins-w54p8\" (UID: \"d86eb1d3-4e06-4952-905d-3bc13ae4849b\") " pod="openshift-multus/multus-additional-cni-plugins-w54p8"
Apr 22 21:09:04.114646 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:04.114497 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 21:09:04.114646 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:04.114518 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 21:09:04.114646 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:04.114532 2566 projected.go:194] Error preparing data for projected volume kube-api-access-jkg9b for pod openshift-network-diagnostics/network-check-target-b4vrr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:09:04.114646 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:04.114598 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b podName:f1c2f55b-0450-43f7-b50e-57b4c3d15108 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:04.614569125 +0000 UTC m=+1.971383871 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-jkg9b" (UniqueName: "kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b") pod "network-check-target-b4vrr" (UID: "f1c2f55b-0450-43f7-b50e-57b4c3d15108") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:09:04.115029 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.115006 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nqts\" (UniqueName: \"kubernetes.io/projected/5eca370a-4734-46fc-810c-17827b9aa727-kube-api-access-4nqts\") pod \"tuned-v4hv7\" (UID: \"5eca370a-4734-46fc-810c-17827b9aa727\") " pod="openshift-cluster-node-tuning-operator/tuned-v4hv7"
Apr 22 21:09:04.115261 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.115241 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvv8h\" (UniqueName: \"kubernetes.io/projected/ba523d03-d12e-4172-89eb-9885c3215d06-kube-api-access-gvv8h\") pod \"multus-f9bbq\" (UID: \"ba523d03-d12e-4172-89eb-9885c3215d06\") " pod="openshift-multus/multus-f9bbq"
Apr 22 21:09:04.115875 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.115841 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-867xc\" (UniqueName: \"kubernetes.io/projected/fbd1aca7-dece-43f0-b914-b6993f56f39a-kube-api-access-867xc\") pod \"node-ca-jq6nq\" (UID: \"fbd1aca7-dece-43f0-b914-b6993f56f39a\") " pod="openshift-image-registry/node-ca-jq6nq"
Apr 22 21:09:04.116121 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.116098 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg5rc\" (UniqueName: \"kubernetes.io/projected/be7fc1c2-37d7-43b8-9b3e-949d5ff50946-kube-api-access-mg5rc\") pod \"aws-ebs-csi-driver-node-mq7k6\" (UID: \"be7fc1c2-37d7-43b8-9b3e-949d5ff50946\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6"
Apr 22 21:09:04.116642 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.116623 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgxf8\" (UniqueName: \"kubernetes.io/projected/83539234-7602-4c9d-a9c8-05dca158b65b-kube-api-access-tgxf8\") pod \"node-resolver-t4hx2\" (UID: \"83539234-7602-4c9d-a9c8-05dca158b65b\") " pod="openshift-dns/node-resolver-t4hx2"
Apr 22 21:09:04.116956 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.116939 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbgxm\" (UniqueName: \"kubernetes.io/projected/01c1be80-c7fc-433f-bf11-a97af5540866-kube-api-access-cbgxm\") pod \"network-metrics-daemon-qvqz5\" (UID: \"01c1be80-c7fc-433f-bf11-a97af5540866\") " pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:04.129668 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:04.129642 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6c453189c3f15e838fbba5937c30555.slice/crio-9d7558e417258923ade635bfc5adc2297655947a89c2651121b7d799326c65f4 WatchSource:0}: Error finding container 9d7558e417258923ade635bfc5adc2297655947a89c2651121b7d799326c65f4: Status 404 returned error can't find the container with id 9d7558e417258923ade635bfc5adc2297655947a89c2651121b7d799326c65f4
Apr 22 21:09:04.129975 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:04.129961 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a5c58d625b2bd1b4be43292bd5f1c38.slice/crio-00bb4180f12826b24da80d0f11861f2615a3f2f3dcc6b64e5f82c48bdd0eb46a WatchSource:0}: Error finding container 00bb4180f12826b24da80d0f11861f2615a3f2f3dcc6b64e5f82c48bdd0eb46a: Status 404 returned error can't find the container with id 00bb4180f12826b24da80d0f11861f2615a3f2f3dcc6b64e5f82c48bdd0eb46a
Apr 22 21:09:04.134267 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.134249 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 21:09:04.205553 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205516 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-run-systemd\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:04.205553 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205554 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hw2hr\" (UniqueName: \"kubernetes.io/projected/e994ba74-d681-4003-b112-a3851276ee9b-kube-api-access-hw2hr\") pod \"iptables-alerter-q2xw2\" (UID: \"e994ba74-d681-4003-b112-a3851276ee9b\") " pod="openshift-network-operator/iptables-alerter-q2xw2"
Apr 22 21:09:04.205727 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205572 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-host-kubelet\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:04.205727 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205588 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-log-socket\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:04.205727 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205626 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-run-systemd\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:04.205727 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205644 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-host-kubelet\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:04.205727 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205646 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-log-socket\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:04.205727 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205717 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-host-cni-bin\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:04.206002 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205737 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-etc-openvswitch\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:04.206002 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205752 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName:
\"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-run-openvswitch\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206002 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205781 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-var-lib-openvswitch\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206002 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205799 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-host-run-ovn-kubernetes\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206002 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205829 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-host-run-ovn-kubernetes\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206002 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205829 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-var-lib-openvswitch\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206002 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205829 2566 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-run-openvswitch\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206002 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205839 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmkn6\" (UniqueName: \"kubernetes.io/projected/aee77461-dec9-4db2-99dc-b345c2c300bb-kube-api-access-zmkn6\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206002 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205846 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-etc-openvswitch\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206002 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205874 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-host-cni-bin\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206002 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205874 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-node-log\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206002 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205903 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e994ba74-d681-4003-b112-a3851276ee9b-iptables-alerter-script\") pod \"iptables-alerter-q2xw2\" (UID: \"e994ba74-d681-4003-b112-a3851276ee9b\") " pod="openshift-network-operator/iptables-alerter-q2xw2" Apr 22 21:09:04.206002 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205912 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-node-log\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206002 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205919 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/aee77461-dec9-4db2-99dc-b345c2c300bb-ovn-node-metrics-cert\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206002 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205939 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206002 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205964 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/aee77461-dec9-4db2-99dc-b345c2c300bb-ovnkube-config\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206002 
ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.205990 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-host-cni-netd\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206780 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.206013 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e994ba74-d681-4003-b112-a3851276ee9b-host-slash\") pod \"iptables-alerter-q2xw2\" (UID: \"e994ba74-d681-4003-b112-a3851276ee9b\") " pod="openshift-network-operator/iptables-alerter-q2xw2" Apr 22 21:09:04.206780 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.206010 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206780 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.206054 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-host-cni-netd\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206780 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.206062 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/aee77461-dec9-4db2-99dc-b345c2c300bb-env-overrides\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206780 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.206093 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/aee77461-dec9-4db2-99dc-b345c2c300bb-ovnkube-script-lib\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206780 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.206120 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-systemd-units\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206780 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.206095 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e994ba74-d681-4003-b112-a3851276ee9b-host-slash\") pod \"iptables-alerter-q2xw2\" (UID: \"e994ba74-d681-4003-b112-a3851276ee9b\") " pod="openshift-network-operator/iptables-alerter-q2xw2" Apr 22 21:09:04.206780 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.206145 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-host-slash\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206780 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.206171 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-host-run-netns\") pod \"ovnkube-node-wqvs8\" (UID: 
\"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206780 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.206194 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-run-ovn\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206780 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.206262 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-run-ovn\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206780 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.206304 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-systemd-units\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206780 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.206344 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-host-run-netns\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206780 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.206353 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/aee77461-dec9-4db2-99dc-b345c2c300bb-host-slash\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206780 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.206503 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e994ba74-d681-4003-b112-a3851276ee9b-iptables-alerter-script\") pod \"iptables-alerter-q2xw2\" (UID: \"e994ba74-d681-4003-b112-a3851276ee9b\") " pod="openshift-network-operator/iptables-alerter-q2xw2" Apr 22 21:09:04.206780 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.206517 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/aee77461-dec9-4db2-99dc-b345c2c300bb-env-overrides\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.206780 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.206524 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/aee77461-dec9-4db2-99dc-b345c2c300bb-ovnkube-config\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.207614 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.207150 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/aee77461-dec9-4db2-99dc-b345c2c300bb-ovnkube-script-lib\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.208348 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.208324 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/aee77461-dec9-4db2-99dc-b345c2c300bb-ovn-node-metrics-cert\") pod \"ovnkube-node-wqvs8\" (UID: 
\"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.217090 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.217062 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmkn6\" (UniqueName: \"kubernetes.io/projected/aee77461-dec9-4db2-99dc-b345c2c300bb-kube-api-access-zmkn6\") pod \"ovnkube-node-wqvs8\" (UID: \"aee77461-dec9-4db2-99dc-b345c2c300bb\") " pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.217173 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.217102 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw2hr\" (UniqueName: \"kubernetes.io/projected/e994ba74-d681-4003-b112-a3851276ee9b-kube-api-access-hw2hr\") pod \"iptables-alerter-q2xw2\" (UID: \"e994ba74-d681-4003-b112-a3851276ee9b\") " pod="openshift-network-operator/iptables-alerter-q2xw2" Apr 22 21:09:04.311381 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.311321 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fcxqm" Apr 22 21:09:04.317135 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:04.317112 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c9bd2f4_6d8d_4016_b3c4_d2b398b5abc9.slice/crio-85a03f038cd34b04071c058db1d23fa87f7f817f47726aee3207bb912c3b7a1e WatchSource:0}: Error finding container 85a03f038cd34b04071c058db1d23fa87f7f817f47726aee3207bb912c3b7a1e: Status 404 returned error can't find the container with id 85a03f038cd34b04071c058db1d23fa87f7f817f47726aee3207bb912c3b7a1e Apr 22 21:09:04.333502 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.333484 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-v4hv7" Apr 22 21:09:04.338720 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:04.338698 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5eca370a_4734_46fc_810c_17827b9aa727.slice/crio-7e4733973d8ea06af59f2ce46c0a47cb8a0c94833f7b5bc193718aa8439808bd WatchSource:0}: Error finding container 7e4733973d8ea06af59f2ce46c0a47cb8a0c94833f7b5bc193718aa8439808bd: Status 404 returned error can't find the container with id 7e4733973d8ea06af59f2ce46c0a47cb8a0c94833f7b5bc193718aa8439808bd Apr 22 21:09:04.342536 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.342521 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-t4hx2" Apr 22 21:09:04.348138 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:04.348117 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83539234_7602_4c9d_a9c8_05dca158b65b.slice/crio-650e967a7c8f9bdc8e104fd22e891659adc1120f747f65c3ccbfdc10534181b3 WatchSource:0}: Error finding container 650e967a7c8f9bdc8e104fd22e891659adc1120f747f65c3ccbfdc10534181b3: Status 404 returned error can't find the container with id 650e967a7c8f9bdc8e104fd22e891659adc1120f747f65c3ccbfdc10534181b3 Apr 22 21:09:04.348375 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.348360 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 21:09:04.367297 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.367150 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-jq6nq" Apr 22 21:09:04.372208 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:04.372182 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbd1aca7_dece_43f0_b914_b6993f56f39a.slice/crio-d05d70c63d4879d0ef651b5cc8a214e92feb91d2df142e2504abdab0e0ef9515 WatchSource:0}: Error finding container d05d70c63d4879d0ef651b5cc8a214e92feb91d2df142e2504abdab0e0ef9515: Status 404 returned error can't find the container with id d05d70c63d4879d0ef651b5cc8a214e92feb91d2df142e2504abdab0e0ef9515 Apr 22 21:09:04.377816 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.377797 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-w54p8" Apr 22 21:09:04.383179 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:04.383157 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd86eb1d3_4e06_4952_905d_3bc13ae4849b.slice/crio-334a4f57d3555744de2bfcac36df8936cfe9fab6544d6296e315e0359b13cfef WatchSource:0}: Error finding container 334a4f57d3555744de2bfcac36df8936cfe9fab6544d6296e315e0359b13cfef: Status 404 returned error can't find the container with id 334a4f57d3555744de2bfcac36df8936cfe9fab6544d6296e315e0359b13cfef Apr 22 21:09:04.393952 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.393936 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6" Apr 22 21:09:04.400092 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.399587 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-f9bbq" Apr 22 21:09:04.405721 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:04.405700 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba523d03_d12e_4172_89eb_9885c3215d06.slice/crio-ed398b5401f1422e0afffa3141a4873daf081128fd241f01081cc78c1f9f2987 WatchSource:0}: Error finding container ed398b5401f1422e0afffa3141a4873daf081128fd241f01081cc78c1f9f2987: Status 404 returned error can't find the container with id ed398b5401f1422e0afffa3141a4873daf081128fd241f01081cc78c1f9f2987 Apr 22 21:09:04.416202 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.416184 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-q2xw2" Apr 22 21:09:04.422198 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:04.422178 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode994ba74_d681_4003_b112_a3851276ee9b.slice/crio-2a1b4a4f0c9276addb64032db68e568f136a6ef3fba1b3e8218d579ac5105d37 WatchSource:0}: Error finding container 2a1b4a4f0c9276addb64032db68e568f136a6ef3fba1b3e8218d579ac5105d37: Status 404 returned error can't find the container with id 2a1b4a4f0c9276addb64032db68e568f136a6ef3fba1b3e8218d579ac5105d37 Apr 22 21:09:04.431218 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.431202 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" Apr 22 21:09:04.436465 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:04.436444 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaee77461_dec9_4db2_99dc_b345c2c300bb.slice/crio-56c781ddd33ff83e51ad0d42e8bcf737804cfb2ccfbce1d0adadb1384d9a29e9 WatchSource:0}: Error finding container 56c781ddd33ff83e51ad0d42e8bcf737804cfb2ccfbce1d0adadb1384d9a29e9: Status 404 returned error can't find the container with id 56c781ddd33ff83e51ad0d42e8bcf737804cfb2ccfbce1d0adadb1384d9a29e9 Apr 22 21:09:04.610215 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.610127 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs\") pod \"network-metrics-daemon-qvqz5\" (UID: \"01c1be80-c7fc-433f-bf11-a97af5540866\") " pod="openshift-multus/network-metrics-daemon-qvqz5" Apr 22 21:09:04.610371 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:04.610294 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:04.610371 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:04.610367 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs podName:01c1be80-c7fc-433f-bf11-a97af5540866 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:05.610349536 +0000 UTC m=+2.967164281 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs") pod "network-metrics-daemon-qvqz5" (UID: "01c1be80-c7fc-433f-bf11-a97af5540866") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:04.711051 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:04.711021 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkg9b\" (UniqueName: \"kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b\") pod \"network-check-target-b4vrr\" (UID: \"f1c2f55b-0450-43f7-b50e-57b4c3d15108\") " pod="openshift-network-diagnostics/network-check-target-b4vrr" Apr 22 21:09:04.711212 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:04.711176 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 21:09:04.711212 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:04.711195 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 21:09:04.711212 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:04.711208 2566 projected.go:194] Error preparing data for projected volume kube-api-access-jkg9b for pod openshift-network-diagnostics/network-check-target-b4vrr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:04.711370 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:04.711260 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b podName:f1c2f55b-0450-43f7-b50e-57b4c3d15108 nodeName:}" failed. 
No retries permitted until 2026-04-22 21:09:05.711243128 +0000 UTC m=+3.068057863 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-jkg9b" (UniqueName: "kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b") pod "network-check-target-b4vrr" (UID: "f1c2f55b-0450-43f7-b50e-57b4c3d15108") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:05.031622 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:05.031545 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 21:04:04 +0000 UTC" deadline="2027-11-20 17:09:33.831112837 +0000 UTC" Apr 22 21:09:05.031622 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:05.031582 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13844h0m28.799534199s" Apr 22 21:09:05.143855 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:05.143362 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4vrr" Apr 22 21:09:05.143855 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:05.143497 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b4vrr" podUID="f1c2f55b-0450-43f7-b50e-57b4c3d15108" Apr 22 21:09:05.163867 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:05.163813 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-137.ec2.internal" event={"ID":"c6c453189c3f15e838fbba5937c30555","Type":"ContainerStarted","Data":"9d7558e417258923ade635bfc5adc2297655947a89c2651121b7d799326c65f4"} Apr 22 21:09:05.199000 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:05.198969 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-137.ec2.internal" event={"ID":"8a5c58d625b2bd1b4be43292bd5f1c38","Type":"ContainerStarted","Data":"00bb4180f12826b24da80d0f11861f2615a3f2f3dcc6b64e5f82c48bdd0eb46a"} Apr 22 21:09:05.219581 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:05.219550 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" event={"ID":"aee77461-dec9-4db2-99dc-b345c2c300bb","Type":"ContainerStarted","Data":"56c781ddd33ff83e51ad0d42e8bcf737804cfb2ccfbce1d0adadb1384d9a29e9"} Apr 22 21:09:05.226786 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:05.226615 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-q2xw2" event={"ID":"e994ba74-d681-4003-b112-a3851276ee9b","Type":"ContainerStarted","Data":"2a1b4a4f0c9276addb64032db68e568f136a6ef3fba1b3e8218d579ac5105d37"} Apr 22 21:09:05.238231 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:05.238206 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f9bbq" event={"ID":"ba523d03-d12e-4172-89eb-9885c3215d06","Type":"ContainerStarted","Data":"ed398b5401f1422e0afffa3141a4873daf081128fd241f01081cc78c1f9f2987"} Apr 22 21:09:05.248305 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:05.248015 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6" event={"ID":"be7fc1c2-37d7-43b8-9b3e-949d5ff50946","Type":"ContainerStarted","Data":"fe971812bc97a822b647bc5e2bb1b93af1c20f5f9dd6153ba34fa126e3a8238e"} Apr 22 21:09:05.252773 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:05.252748 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w54p8" event={"ID":"d86eb1d3-4e06-4952-905d-3bc13ae4849b","Type":"ContainerStarted","Data":"334a4f57d3555744de2bfcac36df8936cfe9fab6544d6296e315e0359b13cfef"} Apr 22 21:09:05.271212 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:05.271186 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-v4hv7" event={"ID":"5eca370a-4734-46fc-810c-17827b9aa727","Type":"ContainerStarted","Data":"7e4733973d8ea06af59f2ce46c0a47cb8a0c94833f7b5bc193718aa8439808bd"} Apr 22 21:09:05.303285 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:05.303183 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jq6nq" event={"ID":"fbd1aca7-dece-43f0-b914-b6993f56f39a","Type":"ContainerStarted","Data":"d05d70c63d4879d0ef651b5cc8a214e92feb91d2df142e2504abdab0e0ef9515"} Apr 22 21:09:05.324981 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:05.324950 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-t4hx2" event={"ID":"83539234-7602-4c9d-a9c8-05dca158b65b","Type":"ContainerStarted","Data":"650e967a7c8f9bdc8e104fd22e891659adc1120f747f65c3ccbfdc10534181b3"} Apr 22 21:09:05.331822 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:05.331799 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fcxqm" event={"ID":"8c9bd2f4-6d8d-4016-b3c4-d2b398b5abc9","Type":"ContainerStarted","Data":"85a03f038cd34b04071c058db1d23fa87f7f817f47726aee3207bb912c3b7a1e"} Apr 22 21:09:05.402834 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:05.402804 2566 
reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 21:09:05.620340 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:05.620220 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs\") pod \"network-metrics-daemon-qvqz5\" (UID: \"01c1be80-c7fc-433f-bf11-a97af5540866\") " pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:05.620513 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:05.620402 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:09:05.620513 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:05.620489 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs podName:01c1be80-c7fc-433f-bf11-a97af5540866 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:07.620469876 +0000 UTC m=+4.977284612 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs") pod "network-metrics-daemon-qvqz5" (UID: "01c1be80-c7fc-433f-bf11-a97af5540866") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:09:05.720944 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:05.720625 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkg9b\" (UniqueName: \"kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b\") pod \"network-check-target-b4vrr\" (UID: \"f1c2f55b-0450-43f7-b50e-57b4c3d15108\") " pod="openshift-network-diagnostics/network-check-target-b4vrr"
Apr 22 21:09:05.720944 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:05.720833 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 21:09:05.720944 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:05.720854 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 21:09:05.720944 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:05.720867 2566 projected.go:194] Error preparing data for projected volume kube-api-access-jkg9b for pod openshift-network-diagnostics/network-check-target-b4vrr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:09:05.720944 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:05.720923 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b podName:f1c2f55b-0450-43f7-b50e-57b4c3d15108 nodeName:}" failed.
No retries permitted until 2026-04-22 21:09:07.720906203 +0000 UTC m=+5.077720935 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-jkg9b" (UniqueName: "kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b") pod "network-check-target-b4vrr" (UID: "f1c2f55b-0450-43f7-b50e-57b4c3d15108") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:09:06.032204 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:06.032081 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 21:04:04 +0000 UTC" deadline="2028-01-31 21:49:10.187856578 +0000 UTC"
Apr 22 21:09:06.032204 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:06.032119 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15576h40m4.155741376s"
Apr 22 21:09:06.128193 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:06.128049 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:06.128367 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:06.128206 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvqz5" podUID="01c1be80-c7fc-433f-bf11-a97af5540866"
Apr 22 21:09:07.136667 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:07.136635 2566 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4vrr"
Apr 22 21:09:07.137118 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:07.136757 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4vrr" podUID="f1c2f55b-0450-43f7-b50e-57b4c3d15108"
Apr 22 21:09:07.636374 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:07.636290 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs\") pod \"network-metrics-daemon-qvqz5\" (UID: \"01c1be80-c7fc-433f-bf11-a97af5540866\") " pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:07.636552 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:07.636457 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:09:07.636552 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:07.636528 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs podName:01c1be80-c7fc-433f-bf11-a97af5540866 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:11.636506707 +0000 UTC m=+8.993321440 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs") pod "network-metrics-daemon-qvqz5" (UID: "01c1be80-c7fc-433f-bf11-a97af5540866") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:09:07.737587 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:07.736945 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkg9b\" (UniqueName: \"kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b\") pod \"network-check-target-b4vrr\" (UID: \"f1c2f55b-0450-43f7-b50e-57b4c3d15108\") " pod="openshift-network-diagnostics/network-check-target-b4vrr"
Apr 22 21:09:07.737587 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:07.737151 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 21:09:07.737587 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:07.737172 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 21:09:07.737587 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:07.737185 2566 projected.go:194] Error preparing data for projected volume kube-api-access-jkg9b for pod openshift-network-diagnostics/network-check-target-b4vrr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:09:07.737587 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:07.737243 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b podName:f1c2f55b-0450-43f7-b50e-57b4c3d15108 nodeName:}" failed.
No retries permitted until 2026-04-22 21:09:11.737225216 +0000 UTC m=+9.094039959 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-jkg9b" (UniqueName: "kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b") pod "network-check-target-b4vrr" (UID: "f1c2f55b-0450-43f7-b50e-57b4c3d15108") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:09:08.128816 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:08.128743 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:08.128973 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:08.128883 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvqz5" podUID="01c1be80-c7fc-433f-bf11-a97af5540866"
Apr 22 21:09:09.128866 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:09.128833 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4vrr"
Apr 22 21:09:09.129327 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:09.128965 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-b4vrr" podUID="f1c2f55b-0450-43f7-b50e-57b4c3d15108"
Apr 22 21:09:10.128615 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:10.128581 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:10.128782 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:10.128760 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvqz5" podUID="01c1be80-c7fc-433f-bf11-a97af5540866"
Apr 22 21:09:11.128330 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:11.128294 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4vrr"
Apr 22 21:09:11.128794 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:11.128436 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-b4vrr" podUID="f1c2f55b-0450-43f7-b50e-57b4c3d15108"
Apr 22 21:09:11.668043 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:11.667867 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs\") pod \"network-metrics-daemon-qvqz5\" (UID: \"01c1be80-c7fc-433f-bf11-a97af5540866\") " pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:11.668043 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:11.668027 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:09:11.668278 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:11.668090 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs podName:01c1be80-c7fc-433f-bf11-a97af5540866 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:19.668073107 +0000 UTC m=+17.024887838 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs") pod "network-metrics-daemon-qvqz5" (UID: "01c1be80-c7fc-433f-bf11-a97af5540866") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:09:11.769205 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:11.769156 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkg9b\" (UniqueName: \"kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b\") pod \"network-check-target-b4vrr\" (UID: \"f1c2f55b-0450-43f7-b50e-57b4c3d15108\") " pod="openshift-network-diagnostics/network-check-target-b4vrr"
Apr 22 21:09:11.769372 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:11.769312 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 21:09:11.769372 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:11.769336 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 21:09:11.769372 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:11.769349 2566 projected.go:194] Error preparing data for projected volume kube-api-access-jkg9b for pod openshift-network-diagnostics/network-check-target-b4vrr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:09:11.769551 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:11.769404 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b podName:f1c2f55b-0450-43f7-b50e-57b4c3d15108 nodeName:}" failed.
No retries permitted until 2026-04-22 21:09:19.769386598 +0000 UTC m=+17.126201334 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-jkg9b" (UniqueName: "kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b") pod "network-check-target-b4vrr" (UID: "f1c2f55b-0450-43f7-b50e-57b4c3d15108") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 21:09:12.128643 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:12.128564 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:12.129074 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:12.128700 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvqz5" podUID="01c1be80-c7fc-433f-bf11-a97af5540866"
Apr 22 21:09:13.132476 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:13.130274 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4vrr"
Apr 22 21:09:13.132476 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:13.130457 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-b4vrr" podUID="f1c2f55b-0450-43f7-b50e-57b4c3d15108"
Apr 22 21:09:14.001163 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:14.001135 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-gsqgw"]
Apr 22 21:09:14.002861 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:14.002842 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:14.002960 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:14.002932 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsqgw" podUID="94c92ddb-384e-4003-80b5-1b032afdc994"
Apr 22 21:09:14.087602 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:14.087562 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/94c92ddb-384e-4003-80b5-1b032afdc994-kubelet-config\") pod \"global-pull-secret-syncer-gsqgw\" (UID: \"94c92ddb-384e-4003-80b5-1b032afdc994\") " pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:14.087762 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:14.087649 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/94c92ddb-384e-4003-80b5-1b032afdc994-dbus\") pod \"global-pull-secret-syncer-gsqgw\" (UID: \"94c92ddb-384e-4003-80b5-1b032afdc994\") " pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:14.087762 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:14.087684 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/94c92ddb-384e-4003-80b5-1b032afdc994-original-pull-secret\") pod \"global-pull-secret-syncer-gsqgw\" (UID: \"94c92ddb-384e-4003-80b5-1b032afdc994\") " pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:14.128593 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:14.128565 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:14.128741 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:14.128721 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvqz5" podUID="01c1be80-c7fc-433f-bf11-a97af5540866"
Apr 22 21:09:14.188199 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:14.188164 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/94c92ddb-384e-4003-80b5-1b032afdc994-kubelet-config\") pod \"global-pull-secret-syncer-gsqgw\" (UID: \"94c92ddb-384e-4003-80b5-1b032afdc994\") " pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:14.188688 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:14.188207 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/94c92ddb-384e-4003-80b5-1b032afdc994-dbus\") pod \"global-pull-secret-syncer-gsqgw\" (UID: \"94c92ddb-384e-4003-80b5-1b032afdc994\") " pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:14.188688 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:14.188260 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName:
\"kubernetes.io/secret/94c92ddb-384e-4003-80b5-1b032afdc994-original-pull-secret\") pod \"global-pull-secret-syncer-gsqgw\" (UID: \"94c92ddb-384e-4003-80b5-1b032afdc994\") " pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:14.188688 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:14.188319 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/94c92ddb-384e-4003-80b5-1b032afdc994-kubelet-config\") pod \"global-pull-secret-syncer-gsqgw\" (UID: \"94c92ddb-384e-4003-80b5-1b032afdc994\") " pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:14.188688 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:14.188398 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 21:09:14.188688 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:14.188468 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94c92ddb-384e-4003-80b5-1b032afdc994-original-pull-secret podName:94c92ddb-384e-4003-80b5-1b032afdc994 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:14.688449245 +0000 UTC m=+12.045263977 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/94c92ddb-384e-4003-80b5-1b032afdc994-original-pull-secret") pod "global-pull-secret-syncer-gsqgw" (UID: "94c92ddb-384e-4003-80b5-1b032afdc994") : object "kube-system"/"original-pull-secret" not registered
Apr 22 21:09:14.188688 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:14.188488 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/94c92ddb-384e-4003-80b5-1b032afdc994-dbus\") pod \"global-pull-secret-syncer-gsqgw\" (UID: \"94c92ddb-384e-4003-80b5-1b032afdc994\") " pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:14.692420 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:14.692385 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/94c92ddb-384e-4003-80b5-1b032afdc994-original-pull-secret\") pod \"global-pull-secret-syncer-gsqgw\" (UID: \"94c92ddb-384e-4003-80b5-1b032afdc994\") " pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:14.692597 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:14.692527 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 21:09:14.692671 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:14.692608 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94c92ddb-384e-4003-80b5-1b032afdc994-original-pull-secret podName:94c92ddb-384e-4003-80b5-1b032afdc994 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:15.692588482 +0000 UTC m=+13.049403218 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/94c92ddb-384e-4003-80b5-1b032afdc994-original-pull-secret") pod "global-pull-secret-syncer-gsqgw" (UID: "94c92ddb-384e-4003-80b5-1b032afdc994") : object "kube-system"/"original-pull-secret" not registered
Apr 22 21:09:15.129046 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:15.129009 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4vrr"
Apr 22 21:09:15.129226 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:15.129124 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4vrr" podUID="f1c2f55b-0450-43f7-b50e-57b4c3d15108"
Apr 22 21:09:15.700114 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:15.700075 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/94c92ddb-384e-4003-80b5-1b032afdc994-original-pull-secret\") pod \"global-pull-secret-syncer-gsqgw\" (UID: \"94c92ddb-384e-4003-80b5-1b032afdc994\") " pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:15.700592 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:15.700214 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 21:09:15.700592 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:15.700292 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94c92ddb-384e-4003-80b5-1b032afdc994-original-pull-secret podName:94c92ddb-384e-4003-80b5-1b032afdc994 nodeName:}" failed.
No retries permitted until 2026-04-22 21:09:17.700271566 +0000 UTC m=+15.057086299 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/94c92ddb-384e-4003-80b5-1b032afdc994-original-pull-secret") pod "global-pull-secret-syncer-gsqgw" (UID: "94c92ddb-384e-4003-80b5-1b032afdc994") : object "kube-system"/"original-pull-secret" not registered
Apr 22 21:09:16.128530 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:16.128493 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:16.128683 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:16.128625 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsqgw" podUID="94c92ddb-384e-4003-80b5-1b032afdc994"
Apr 22 21:09:16.128744 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:16.128685 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:16.128847 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:16.128819 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvqz5" podUID="01c1be80-c7fc-433f-bf11-a97af5540866"
Apr 22 21:09:17.128714 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:17.128684 2566 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4vrr"
Apr 22 21:09:17.129160 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:17.128810 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4vrr" podUID="f1c2f55b-0450-43f7-b50e-57b4c3d15108"
Apr 22 21:09:17.713602 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:17.713563 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/94c92ddb-384e-4003-80b5-1b032afdc994-original-pull-secret\") pod \"global-pull-secret-syncer-gsqgw\" (UID: \"94c92ddb-384e-4003-80b5-1b032afdc994\") " pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:17.713779 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:17.713722 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 21:09:17.713821 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:17.713803 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94c92ddb-384e-4003-80b5-1b032afdc994-original-pull-secret podName:94c92ddb-384e-4003-80b5-1b032afdc994 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:21.713782763 +0000 UTC m=+19.070597500 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/94c92ddb-384e-4003-80b5-1b032afdc994-original-pull-secret") pod "global-pull-secret-syncer-gsqgw" (UID: "94c92ddb-384e-4003-80b5-1b032afdc994") : object "kube-system"/"original-pull-secret" not registered
Apr 22 21:09:18.128267 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:18.128237 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:18.128461 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:18.128243 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:18.128461 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:18.128338 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsqgw" podUID="94c92ddb-384e-4003-80b5-1b032afdc994"
Apr 22 21:09:18.128570 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:18.128462 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvqz5" podUID="01c1be80-c7fc-433f-bf11-a97af5540866"
Apr 22 21:09:19.128213 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:19.128179 2566 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4vrr"
Apr 22 21:09:19.128650 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:19.128307 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4vrr" podUID="f1c2f55b-0450-43f7-b50e-57b4c3d15108"
Apr 22 21:09:19.729249 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:19.729211 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs\") pod \"network-metrics-daemon-qvqz5\" (UID: \"01c1be80-c7fc-433f-bf11-a97af5540866\") " pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:19.729403 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:19.729375 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:09:19.729481 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:19.729455 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs podName:01c1be80-c7fc-433f-bf11-a97af5540866 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:35.729435295 +0000 UTC m=+33.086250027 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs") pod "network-metrics-daemon-qvqz5" (UID: "01c1be80-c7fc-433f-bf11-a97af5540866") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 21:09:19.829842 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:19.829802 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkg9b\" (UniqueName: \"kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b\") pod \"network-check-target-b4vrr\" (UID: \"f1c2f55b-0450-43f7-b50e-57b4c3d15108\") " pod="openshift-network-diagnostics/network-check-target-b4vrr" Apr 22 21:09:19.830028 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:19.829940 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 21:09:19.830028 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:19.829958 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 21:09:19.830028 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:19.829969 2566 projected.go:194] Error preparing data for projected volume kube-api-access-jkg9b for pod openshift-network-diagnostics/network-check-target-b4vrr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:19.830028 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:19.830023 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b podName:f1c2f55b-0450-43f7-b50e-57b4c3d15108 nodeName:}" failed. 
No retries permitted until 2026-04-22 21:09:35.830004493 +0000 UTC m=+33.186819224 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-jkg9b" (UniqueName: "kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b") pod "network-check-target-b4vrr" (UID: "f1c2f55b-0450-43f7-b50e-57b4c3d15108") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:20.128601 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:20.128567 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvqz5" Apr 22 21:09:20.129033 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:20.128567 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsqgw" Apr 22 21:09:20.129033 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:20.128719 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvqz5" podUID="01c1be80-c7fc-433f-bf11-a97af5540866" Apr 22 21:09:20.129033 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:20.128754 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-gsqgw" podUID="94c92ddb-384e-4003-80b5-1b032afdc994" Apr 22 21:09:21.128854 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:21.128825 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4vrr" Apr 22 21:09:21.129287 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:21.128948 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4vrr" podUID="f1c2f55b-0450-43f7-b50e-57b4c3d15108" Apr 22 21:09:21.744492 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:21.744452 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/94c92ddb-384e-4003-80b5-1b032afdc994-original-pull-secret\") pod \"global-pull-secret-syncer-gsqgw\" (UID: \"94c92ddb-384e-4003-80b5-1b032afdc994\") " pod="kube-system/global-pull-secret-syncer-gsqgw" Apr 22 21:09:21.744663 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:21.744582 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 21:09:21.744663 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:21.744643 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94c92ddb-384e-4003-80b5-1b032afdc994-original-pull-secret podName:94c92ddb-384e-4003-80b5-1b032afdc994 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:29.744628044 +0000 UTC m=+27.101442774 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/94c92ddb-384e-4003-80b5-1b032afdc994-original-pull-secret") pod "global-pull-secret-syncer-gsqgw" (UID: "94c92ddb-384e-4003-80b5-1b032afdc994") : object "kube-system"/"original-pull-secret" not registered Apr 22 21:09:22.128075 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:22.128041 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsqgw" Apr 22 21:09:22.128231 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:22.128153 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsqgw" podUID="94c92ddb-384e-4003-80b5-1b032afdc994" Apr 22 21:09:22.128306 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:22.128228 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvqz5" Apr 22 21:09:22.128357 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:22.128312 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvqz5" podUID="01c1be80-c7fc-433f-bf11-a97af5540866" Apr 22 21:09:23.129915 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.129690 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4vrr" Apr 22 21:09:23.130464 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:23.130010 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4vrr" podUID="f1c2f55b-0450-43f7-b50e-57b4c3d15108" Apr 22 21:09:23.367277 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.367246 2566 generic.go:358] "Generic (PLEG): container finished" podID="8a5c58d625b2bd1b4be43292bd5f1c38" containerID="a52a05a18517220e53b8bd0e20e829b3334c2ef46e6fea5e9a454d6326a62f23" exitCode=0 Apr 22 21:09:23.367431 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.367315 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-137.ec2.internal" event={"ID":"8a5c58d625b2bd1b4be43292bd5f1c38","Type":"ContainerDied","Data":"a52a05a18517220e53b8bd0e20e829b3334c2ef46e6fea5e9a454d6326a62f23"} Apr 22 21:09:23.369645 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.369626 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovn-acl-logging/0.log" Apr 22 21:09:23.369929 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.369911 2566 generic.go:358] "Generic (PLEG): container finished" podID="aee77461-dec9-4db2-99dc-b345c2c300bb" containerID="1c9d1730b499fd6ecbfba49ed21328ac143d4aaeea0f0ea32a29bdb3293f865a" exitCode=1 Apr 22 21:09:23.369996 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.369969 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" 
event={"ID":"aee77461-dec9-4db2-99dc-b345c2c300bb","Type":"ContainerStarted","Data":"3e5d3cc4d047cec9f8dc8ef4c64300db55dfb66c88e1bbb473ac8dea0fbfdff8"} Apr 22 21:09:23.369996 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.369985 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" event={"ID":"aee77461-dec9-4db2-99dc-b345c2c300bb","Type":"ContainerStarted","Data":"64c5f005e8048a61a9b5c2265c323941a803a60213e4d270b9246b4ec734b09b"} Apr 22 21:09:23.370072 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.369995 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" event={"ID":"aee77461-dec9-4db2-99dc-b345c2c300bb","Type":"ContainerStarted","Data":"146ad4ebb8badeb306feebe25cf90f7747b61a5eebbac0ca8d1b7d9b62e6894f"} Apr 22 21:09:23.370072 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.370008 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" event={"ID":"aee77461-dec9-4db2-99dc-b345c2c300bb","Type":"ContainerStarted","Data":"c2f6f6037cdefbffb542ddd5fde5ecb2861fc4b22533fb652e80005eea06fef7"} Apr 22 21:09:23.370072 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.370017 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" event={"ID":"aee77461-dec9-4db2-99dc-b345c2c300bb","Type":"ContainerDied","Data":"1c9d1730b499fd6ecbfba49ed21328ac143d4aaeea0f0ea32a29bdb3293f865a"} Apr 22 21:09:23.370072 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.370031 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" event={"ID":"aee77461-dec9-4db2-99dc-b345c2c300bb","Type":"ContainerStarted","Data":"bc639bfa1415237f7d0ca4eba53d0ec13a849a6d73e7d10583ab6502fa9b206e"} Apr 22 21:09:23.371190 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.371170 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-f9bbq" event={"ID":"ba523d03-d12e-4172-89eb-9885c3215d06","Type":"ContainerStarted","Data":"15ca5642e08115fa8c781f6467e5d83865ac87b0fde5111e5fb2bc143fad2b9c"} Apr 22 21:09:23.372456 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.372433 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6" event={"ID":"be7fc1c2-37d7-43b8-9b3e-949d5ff50946","Type":"ContainerStarted","Data":"fc5f3bb921cc9fdf3d912c4a17e9d4db07404fa5bcf826137408b0527f479e27"} Apr 22 21:09:23.373510 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.373489 2566 generic.go:358] "Generic (PLEG): container finished" podID="d86eb1d3-4e06-4952-905d-3bc13ae4849b" containerID="b750daf42d41af90c153382c38eb3203a18970349f2d939d7de9027ae4909972" exitCode=0 Apr 22 21:09:23.373602 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.373557 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w54p8" event={"ID":"d86eb1d3-4e06-4952-905d-3bc13ae4849b","Type":"ContainerDied","Data":"b750daf42d41af90c153382c38eb3203a18970349f2d939d7de9027ae4909972"} Apr 22 21:09:23.374783 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.374763 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-v4hv7" event={"ID":"5eca370a-4734-46fc-810c-17827b9aa727","Type":"ContainerStarted","Data":"f11abae5d3dc6550c253d68fb85228110744e09ad1d116984f5d6ac298ac7160"} Apr 22 21:09:23.375958 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.375934 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jq6nq" event={"ID":"fbd1aca7-dece-43f0-b914-b6993f56f39a","Type":"ContainerStarted","Data":"57c0a21255313c60a4cb14765bdde317dfd09965d79b43a4a00cf373a86d252a"} Apr 22 21:09:23.377180 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.377157 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-t4hx2" event={"ID":"83539234-7602-4c9d-a9c8-05dca158b65b","Type":"ContainerStarted","Data":"94d1d5d7d2b68ee36dd756ecc20038a8b41c9f4b3b4f73c49693f8250a1efcef"} Apr 22 21:09:23.378496 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.378479 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fcxqm" event={"ID":"8c9bd2f4-6d8d-4016-b3c4-d2b398b5abc9","Type":"ContainerStarted","Data":"e71886fa073d355e3ed7ce4c0ca32cfe2725c72830cf7ec549077a8b61f6a5cc"} Apr 22 21:09:23.381904 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.381883 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-137.ec2.internal" event={"ID":"c6c453189c3f15e838fbba5937c30555","Type":"ContainerStarted","Data":"c35817f2cd64934039c592c68cd43894c3afd670a4f69a6d7dda173a215df730"} Apr 22 21:09:23.389539 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.389489 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-v4hv7" podStartSLOduration=2.44225951 podStartE2EDuration="20.389477821s" podCreationTimestamp="2026-04-22 21:09:03 +0000 UTC" firstStartedPulling="2026-04-22 21:09:04.340167604 +0000 UTC m=+1.696982338" lastFinishedPulling="2026-04-22 21:09:22.287385908 +0000 UTC m=+19.644200649" observedRunningTime="2026-04-22 21:09:23.389108333 +0000 UTC m=+20.745923087" watchObservedRunningTime="2026-04-22 21:09:23.389477821 +0000 UTC m=+20.746292573" Apr 22 21:09:23.399607 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.399562 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-t4hx2" podStartSLOduration=2.482133065 podStartE2EDuration="20.399551664s" podCreationTimestamp="2026-04-22 21:09:03 +0000 UTC" firstStartedPulling="2026-04-22 21:09:04.349655397 +0000 UTC m=+1.706470130" lastFinishedPulling="2026-04-22 21:09:22.267073991 +0000 UTC m=+19.623888729" 
observedRunningTime="2026-04-22 21:09:23.398972962 +0000 UTC m=+20.755787736" watchObservedRunningTime="2026-04-22 21:09:23.399551664 +0000 UTC m=+20.756366416" Apr 22 21:09:23.434599 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.432223 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jq6nq" podStartSLOduration=2.490453635 podStartE2EDuration="20.432209561s" podCreationTimestamp="2026-04-22 21:09:03 +0000 UTC" firstStartedPulling="2026-04-22 21:09:04.373452361 +0000 UTC m=+1.730267091" lastFinishedPulling="2026-04-22 21:09:22.315208286 +0000 UTC m=+19.672023017" observedRunningTime="2026-04-22 21:09:23.432036835 +0000 UTC m=+20.788851588" watchObservedRunningTime="2026-04-22 21:09:23.432209561 +0000 UTC m=+20.789024316" Apr 22 21:09:23.445680 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.445648 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-f9bbq" podStartSLOduration=2.505295092 podStartE2EDuration="20.445637359s" podCreationTimestamp="2026-04-22 21:09:03 +0000 UTC" firstStartedPulling="2026-04-22 21:09:04.407034035 +0000 UTC m=+1.763848765" lastFinishedPulling="2026-04-22 21:09:22.347376301 +0000 UTC m=+19.704191032" observedRunningTime="2026-04-22 21:09:23.445485818 +0000 UTC m=+20.802300570" watchObservedRunningTime="2026-04-22 21:09:23.445637359 +0000 UTC m=+20.802452111" Apr 22 21:09:23.466494 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.466463 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-fcxqm" podStartSLOduration=2.517893913 podStartE2EDuration="20.466453426s" podCreationTimestamp="2026-04-22 21:09:03 +0000 UTC" firstStartedPulling="2026-04-22 21:09:04.318474407 +0000 UTC m=+1.675289136" lastFinishedPulling="2026-04-22 21:09:22.267033919 +0000 UTC m=+19.623848649" observedRunningTime="2026-04-22 21:09:23.466254618 +0000 UTC m=+20.823069372" 
watchObservedRunningTime="2026-04-22 21:09:23.466453426 +0000 UTC m=+20.823268175" Apr 22 21:09:23.466569 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:23.466520 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-137.ec2.internal" podStartSLOduration=20.466516341 podStartE2EDuration="20.466516341s" podCreationTimestamp="2026-04-22 21:09:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:09:23.455126741 +0000 UTC m=+20.811941493" watchObservedRunningTime="2026-04-22 21:09:23.466516341 +0000 UTC m=+20.823331093" Apr 22 21:09:24.128135 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:24.127989 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvqz5" Apr 22 21:09:24.128135 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:24.128000 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsqgw" Apr 22 21:09:24.128135 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:24.128093 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvqz5" podUID="01c1be80-c7fc-433f-bf11-a97af5540866" Apr 22 21:09:24.128360 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:24.128188 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-gsqgw" podUID="94c92ddb-384e-4003-80b5-1b032afdc994" Apr 22 21:09:24.133886 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:24.133858 2566 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 21:09:24.385559 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:24.385513 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-q2xw2" event={"ID":"e994ba74-d681-4003-b112-a3851276ee9b","Type":"ContainerStarted","Data":"1571b57fbde71c72dc65efcdf16974f3df3705191eb7bc30aed0cf60fc33d774"} Apr 22 21:09:24.387604 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:24.387571 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6" event={"ID":"be7fc1c2-37d7-43b8-9b3e-949d5ff50946","Type":"ContainerStarted","Data":"07088884047b7ae2af15b81e90d564fd71fef6001f5be7ffb24341483a4cc10a"} Apr 22 21:09:24.389463 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:24.389436 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-137.ec2.internal" event={"ID":"8a5c58d625b2bd1b4be43292bd5f1c38","Type":"ContainerStarted","Data":"81d6574405ed65a8c40ceabb7122255f3ed4cb1d197fb24046079b1c0508ec8d"} Apr 22 21:09:24.406498 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:24.406450 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-137.ec2.internal" podStartSLOduration=21.406439025 podStartE2EDuration="21.406439025s" podCreationTimestamp="2026-04-22 21:09:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:09:24.40638001 +0000 UTC m=+21.763194762" watchObservedRunningTime="2026-04-22 
21:09:24.406439025 +0000 UTC m=+21.763253777" Apr 22 21:09:24.406915 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:24.406889 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-q2xw2" podStartSLOduration=3.563460351 podStartE2EDuration="21.406880582s" podCreationTimestamp="2026-04-22 21:09:03 +0000 UTC" firstStartedPulling="2026-04-22 21:09:04.42354671 +0000 UTC m=+1.780361444" lastFinishedPulling="2026-04-22 21:09:22.266966935 +0000 UTC m=+19.623781675" observedRunningTime="2026-04-22 21:09:24.396137024 +0000 UTC m=+21.752951777" watchObservedRunningTime="2026-04-22 21:09:24.406880582 +0000 UTC m=+21.763695333" Apr 22 21:09:25.065936 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:25.065622 2566 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T21:09:24.133872462Z","UUID":"94a990b5-c975-4c79-9397-1e7913a497a1","Handler":null,"Name":"","Endpoint":""} Apr 22 21:09:25.068383 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:25.068359 2566 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 21:09:25.068535 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:25.068393 2566 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 21:09:25.128674 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:25.128638 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4vrr" Apr 22 21:09:25.128821 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:25.128759 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4vrr" podUID="f1c2f55b-0450-43f7-b50e-57b4c3d15108" Apr 22 21:09:25.394580 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:25.394541 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovn-acl-logging/0.log" Apr 22 21:09:25.395005 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:25.394916 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" event={"ID":"aee77461-dec9-4db2-99dc-b345c2c300bb","Type":"ContainerStarted","Data":"1812b6f27597ad08dbeebad7f7edfc60dc07c007af5db61073be2bbe08af8c33"} Apr 22 21:09:25.396968 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:25.396939 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6" event={"ID":"be7fc1c2-37d7-43b8-9b3e-949d5ff50946","Type":"ContainerStarted","Data":"e2450223a4f47cca33e50bbe5b4a4c33dd8538cff9f4d37b0643e0c171258f45"} Apr 22 21:09:25.412193 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:25.412153 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mq7k6" podStartSLOduration=1.844612674 podStartE2EDuration="22.412141559s" podCreationTimestamp="2026-04-22 21:09:03 +0000 UTC" firstStartedPulling="2026-04-22 21:09:04.402917136 +0000 UTC m=+1.759731869" lastFinishedPulling="2026-04-22 21:09:24.97044602 +0000 
UTC m=+22.327260754" observedRunningTime="2026-04-22 21:09:25.4118899 +0000 UTC m=+22.768704653" watchObservedRunningTime="2026-04-22 21:09:25.412141559 +0000 UTC m=+22.768956311" Apr 22 21:09:26.128793 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:26.128765 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsqgw" Apr 22 21:09:26.128975 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:26.128766 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvqz5" Apr 22 21:09:26.128975 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:26.128886 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsqgw" podUID="94c92ddb-384e-4003-80b5-1b032afdc994" Apr 22 21:09:26.129081 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:26.128972 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvqz5" podUID="01c1be80-c7fc-433f-bf11-a97af5540866" Apr 22 21:09:27.129018 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:27.128988 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4vrr" Apr 22 21:09:27.129666 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:27.129101 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4vrr" podUID="f1c2f55b-0450-43f7-b50e-57b4c3d15108" Apr 22 21:09:27.253102 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:27.253069 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-fcxqm" Apr 22 21:09:27.253722 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:27.253701 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-fcxqm" Apr 22 21:09:27.400208 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:27.400136 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-fcxqm" Apr 22 21:09:27.400767 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:27.400745 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-fcxqm" Apr 22 21:09:28.128724 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:28.128548 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsqgw" Apr 22 21:09:28.128859 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:28.128554 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:28.128902 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:28.128787 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsqgw" podUID="94c92ddb-384e-4003-80b5-1b032afdc994"
Apr 22 21:09:28.128902 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:28.128870 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvqz5" podUID="01c1be80-c7fc-433f-bf11-a97af5540866"
Apr 22 21:09:28.402787 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:28.402704 2566 generic.go:358] "Generic (PLEG): container finished" podID="d86eb1d3-4e06-4952-905d-3bc13ae4849b" containerID="ee93e0b946a174e80d632e19ef01c9b4035bf30b3c724e5123f942ef789a1dfe" exitCode=0
Apr 22 21:09:28.403502 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:28.402792 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w54p8" event={"ID":"d86eb1d3-4e06-4952-905d-3bc13ae4849b","Type":"ContainerDied","Data":"ee93e0b946a174e80d632e19ef01c9b4035bf30b3c724e5123f942ef789a1dfe"}
Apr 22 21:09:28.409042 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:28.409016 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovn-acl-logging/0.log"
Apr 22 21:09:28.409419 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:28.409388 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" event={"ID":"aee77461-dec9-4db2-99dc-b345c2c300bb","Type":"ContainerStarted","Data":"ef00f9a56d954df7c7737cfaa940bf074e33a08c1fc354174be8f0ed9b0e0c86"}
Apr 22 21:09:28.409863 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:28.409843 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:28.409950 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:28.409936 2566 scope.go:117] "RemoveContainer" containerID="1c9d1730b499fd6ecbfba49ed21328ac143d4aaeea0f0ea32a29bdb3293f865a"
Apr 22 21:09:28.425113 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:28.425096 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:29.128581 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:29.128551 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4vrr"
Apr 22 21:09:29.128706 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:29.128677 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4vrr" podUID="f1c2f55b-0450-43f7-b50e-57b4c3d15108"
Apr 22 21:09:29.409544 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:29.409306 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gsqgw"]
Apr 22 21:09:29.409923 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:29.409573 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:29.409923 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:29.409674 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsqgw" podUID="94c92ddb-384e-4003-80b5-1b032afdc994"
Apr 22 21:09:29.414435 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:29.413051 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qvqz5"]
Apr 22 21:09:29.414435 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:29.413236 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:29.414435 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:29.413377 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvqz5" podUID="01c1be80-c7fc-433f-bf11-a97af5540866"
Apr 22 21:09:29.415210 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:29.415187 2566 generic.go:358] "Generic (PLEG): container finished" podID="d86eb1d3-4e06-4952-905d-3bc13ae4849b" containerID="a4a6ee009610378fd2de4468e7d90e4e6f5a868f207f2291beb7a20b28b5b93f" exitCode=0
Apr 22 21:09:29.415319 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:29.415267 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w54p8" event={"ID":"d86eb1d3-4e06-4952-905d-3bc13ae4849b","Type":"ContainerDied","Data":"a4a6ee009610378fd2de4468e7d90e4e6f5a868f207f2291beb7a20b28b5b93f"}
Apr 22 21:09:29.418707 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:29.418692 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovn-acl-logging/0.log"
Apr 22 21:09:29.419148 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:29.419107 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" event={"ID":"aee77461-dec9-4db2-99dc-b345c2c300bb","Type":"ContainerStarted","Data":"ac507ed26fe27d737ef9bcc31a2667c30b19f1f91c9ed939cd225c5445faf4ac"}
Apr 22 21:09:29.419771 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:29.419751 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:29.419850 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:29.419780 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:29.420128 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:29.420107 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b4vrr"]
Apr 22 21:09:29.420257 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:29.420197 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4vrr"
Apr 22 21:09:29.420320 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:29.420290 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4vrr" podUID="f1c2f55b-0450-43f7-b50e-57b4c3d15108"
Apr 22 21:09:29.435759 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:29.435738 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:09:29.455541 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:29.455458 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8" podStartSLOduration=8.356778612 podStartE2EDuration="26.455426197s" podCreationTimestamp="2026-04-22 21:09:03 +0000 UTC" firstStartedPulling="2026-04-22 21:09:04.437908994 +0000 UTC m=+1.794723728" lastFinishedPulling="2026-04-22 21:09:22.536556584 +0000 UTC m=+19.893371313" observedRunningTime="2026-04-22 21:09:29.454809517 +0000 UTC m=+26.811624304" watchObservedRunningTime="2026-04-22 21:09:29.455426197 +0000 UTC m=+26.812240946"
Apr 22 21:09:29.805846 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:29.805731 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/94c92ddb-384e-4003-80b5-1b032afdc994-original-pull-secret\") pod \"global-pull-secret-syncer-gsqgw\" (UID: \"94c92ddb-384e-4003-80b5-1b032afdc994\") " pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:29.805846 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:29.805837 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 21:09:29.806036 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:29.805901 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/94c92ddb-384e-4003-80b5-1b032afdc994-original-pull-secret podName:94c92ddb-384e-4003-80b5-1b032afdc994 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:45.805881458 +0000 UTC m=+43.162696192 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/94c92ddb-384e-4003-80b5-1b032afdc994-original-pull-secret") pod "global-pull-secret-syncer-gsqgw" (UID: "94c92ddb-384e-4003-80b5-1b032afdc994") : object "kube-system"/"original-pull-secret" not registered
Apr 22 21:09:30.422944 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:30.422907 2566 generic.go:358] "Generic (PLEG): container finished" podID="d86eb1d3-4e06-4952-905d-3bc13ae4849b" containerID="45128e3fdbcefc33b41d8349ad8ecfbe52878bc7da534b94a0ffe6889fc82ec7" exitCode=0
Apr 22 21:09:30.423286 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:30.422947 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w54p8" event={"ID":"d86eb1d3-4e06-4952-905d-3bc13ae4849b","Type":"ContainerDied","Data":"45128e3fdbcefc33b41d8349ad8ecfbe52878bc7da534b94a0ffe6889fc82ec7"}
Apr 22 21:09:31.128207 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:31.128096 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:31.128207 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:31.128097 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:31.128495 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:31.128235 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4vrr"
Apr 22 21:09:31.128495 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:31.128229 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvqz5" podUID="01c1be80-c7fc-433f-bf11-a97af5540866"
Apr 22 21:09:31.128495 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:31.128333 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsqgw" podUID="94c92ddb-384e-4003-80b5-1b032afdc994"
Apr 22 21:09:31.128495 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:31.128424 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4vrr" podUID="f1c2f55b-0450-43f7-b50e-57b4c3d15108"
Apr 22 21:09:33.129901 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:33.129858 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4vrr"
Apr 22 21:09:33.130547 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:33.129955 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:33.130547 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:33.129983 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4vrr" podUID="f1c2f55b-0450-43f7-b50e-57b4c3d15108"
Apr 22 21:09:33.130547 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:33.130032 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsqgw" podUID="94c92ddb-384e-4003-80b5-1b032afdc994"
Apr 22 21:09:33.130547 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:33.130100 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:33.130547 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:33.130189 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvqz5" podUID="01c1be80-c7fc-433f-bf11-a97af5540866"
Apr 22 21:09:35.128858 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.128803 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:35.129321 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.128803 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:35.129321 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:35.128936 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qvqz5" podUID="01c1be80-c7fc-433f-bf11-a97af5540866"
Apr 22 21:09:35.129321 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:35.129005 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsqgw" podUID="94c92ddb-384e-4003-80b5-1b032afdc994"
Apr 22 21:09:35.129321 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.129049 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4vrr"
Apr 22 21:09:35.129321 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:35.129090 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b4vrr" podUID="f1c2f55b-0450-43f7-b50e-57b4c3d15108"
Apr 22 21:09:35.509309 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.509239 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-137.ec2.internal" event="NodeReady"
Apr 22 21:09:35.509470 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.509419 2566 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 21:09:35.539691 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.539662 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f"]
Apr 22 21:09:35.567810 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.567783 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2"]
Apr 22 21:09:35.567962 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.567938 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f"
Apr 22 21:09:35.570246 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.570201 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 22 21:09:35.570384 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.570268 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 22 21:09:35.570384 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.570300 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-fs5c2\""
Apr 22 21:09:35.570384 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.570308 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 22 21:09:35.570687 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.570474 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 22 21:09:35.582135 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.582115 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-79894b4455-t8gz8"]
Apr 22 21:09:35.582274 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.582257 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2"
Apr 22 21:09:35.584490 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.584460 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 22 21:09:35.600072 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.600051 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm"]
Apr 22 21:09:35.600222 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.600205 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-79894b4455-t8gz8"
Apr 22 21:09:35.602812 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.602789 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 21:09:35.603249 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.603211 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 21:09:35.603356 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.603321 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-j9t5d\""
Apr 22 21:09:35.603534 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.603517 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 21:09:35.613266 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.613246 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 21:09:35.622232 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.622211 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f"]
Apr 22 21:09:35.622347 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.622239 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2"]
Apr 22 21:09:35.622347 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.622254 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm"]
Apr 22 21:09:35.622347 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.622264 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-79894b4455-t8gz8"]
Apr 22 21:09:35.622347 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.622277 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wzzzt"]
Apr 22 21:09:35.622347 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.622343 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm"
Apr 22 21:09:35.624177 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.624156 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 22 21:09:35.624277 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.624211 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 22 21:09:35.624277 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.624248 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 22 21:09:35.624447 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.624404 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 22 21:09:35.636531 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.636511 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lszg5"]
Apr 22 21:09:35.636723 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.636684 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wzzzt"
Apr 22 21:09:35.638368 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.638342 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 21:09:35.638480 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.638425 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 21:09:35.638537 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.638499 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 21:09:35.638703 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.638683 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tq52g\""
Apr 22 21:09:35.653797 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.653777 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/381b646f-ad72-498b-b7ba-1b4d21500e65-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f\" (UID: \"381b646f-ad72-498b-b7ba-1b4d21500e65\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f"
Apr 22 21:09:35.653996 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.653976 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrdrq\" (UniqueName: \"kubernetes.io/projected/381b646f-ad72-498b-b7ba-1b4d21500e65-kube-api-access-xrdrq\") pod \"managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f\" (UID: \"381b646f-ad72-498b-b7ba-1b4d21500e65\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f"
Apr 22 21:09:35.655160 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.655144 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wzzzt"]
Apr 22 21:09:35.655238 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.655164 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lszg5"]
Apr 22 21:09:35.655276 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.655260 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lszg5"
Apr 22 21:09:35.656900 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.656880 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 21:09:35.657140 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.657122 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 21:09:35.657232 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.657189 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8d48v\""
Apr 22 21:09:35.754438 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.754393 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8"
Apr 22 21:09:35.754438 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.754437 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/e942491f-8eab-4373-99e0-3a821262ac0d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7dbb487bf6-hmzzm\" (UID: \"e942491f-8eab-4373-99e0-3a821262ac0d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm"
Apr 22 21:09:35.754681 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.754473 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djsfm\" (UniqueName: \"kubernetes.io/projected/e942491f-8eab-4373-99e0-3a821262ac0d-kube-api-access-djsfm\") pod \"cluster-proxy-proxy-agent-7dbb487bf6-hmzzm\" (UID: \"e942491f-8eab-4373-99e0-3a821262ac0d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm"
Apr 22 21:09:35.754681 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.754538 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/edd98168-b1af-4fed-89c3-c589b37e1931-trusted-ca\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8"
Apr 22 21:09:35.754681 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.754568 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1161b36d-e8f6-417a-975a-80f7d1eba5e1-tmp-dir\") pod \"dns-default-lszg5\" (UID: \"1161b36d-e8f6-417a-975a-80f7d1eba5e1\") " pod="openshift-dns/dns-default-lszg5"
Apr 22 21:09:35.754681 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.754600 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrdrq\" (UniqueName: \"kubernetes.io/projected/381b646f-ad72-498b-b7ba-1b4d21500e65-kube-api-access-xrdrq\") pod \"managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f\" (UID: \"381b646f-ad72-498b-b7ba-1b4d21500e65\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f"
Apr 22 21:09:35.754681 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.754621 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/edd98168-b1af-4fed-89c3-c589b37e1931-ca-trust-extracted\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8"
Apr 22 21:09:35.754681 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.754647 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-bound-sa-token\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8"
Apr 22 21:09:35.754681 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.754668 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls\") pod \"dns-default-lszg5\" (UID: \"1161b36d-e8f6-417a-975a-80f7d1eba5e1\") " pod="openshift-dns/dns-default-lszg5"
Apr 22 21:09:35.755007 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.754697 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs\") pod \"network-metrics-daemon-qvqz5\" (UID: \"01c1be80-c7fc-433f-bf11-a97af5540866\") " pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:35.755007 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.754728 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/edd98168-b1af-4fed-89c3-c589b37e1931-image-registry-private-configuration\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8"
Apr 22 21:09:35.755007 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.754753 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/edd98168-b1af-4fed-89c3-c589b37e1931-registry-certificates\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8"
Apr 22 21:09:35.755007 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.754781 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28h9r\" (UniqueName: \"kubernetes.io/projected/b521d1ec-ac06-4801-88b2-25972cfd8773-kube-api-access-28h9r\") pod \"klusterlet-addon-workmgr-84d9f98565-ggst2\" (UID: \"b521d1ec-ac06-4801-88b2-25972cfd8773\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2"
Apr 22 21:09:35.755007 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.754804 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert\") pod \"ingress-canary-wzzzt\" (UID: \"18f0e0a7-a91f-432d-bcb0-5e33fb885077\") " pod="openshift-ingress-canary/ingress-canary-wzzzt"
Apr 22 21:09:35.755007 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:35.754831 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:09:35.755007 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.754837 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg2tb\" (UniqueName: \"kubernetes.io/projected/1161b36d-e8f6-417a-975a-80f7d1eba5e1-kube-api-access-xg2tb\") pod \"dns-default-lszg5\" (UID: \"1161b36d-e8f6-417a-975a-80f7d1eba5e1\") " pod="openshift-dns/dns-default-lszg5"
Apr 22 21:09:35.755007 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:35.754914 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs podName:01c1be80-c7fc-433f-bf11-a97af5540866 nodeName:}" failed. No retries permitted until 2026-04-22 21:10:07.75489246 +0000 UTC m=+65.111707200 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs") pod "network-metrics-daemon-qvqz5" (UID: "01c1be80-c7fc-433f-bf11-a97af5540866") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 21:09:35.755338 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.755007 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/edd98168-b1af-4fed-89c3-c589b37e1931-installation-pull-secrets\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8"
Apr 22 21:09:35.755338 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.755042 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4hw7\" (UniqueName: \"kubernetes.io/projected/18f0e0a7-a91f-432d-bcb0-5e33fb885077-kube-api-access-x4hw7\") pod \"ingress-canary-wzzzt\" (UID: \"18f0e0a7-a91f-432d-bcb0-5e33fb885077\") " pod="openshift-ingress-canary/ingress-canary-wzzzt"
Apr 22 21:09:35.755338 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.755074 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1161b36d-e8f6-417a-975a-80f7d1eba5e1-config-volume\") pod \"dns-default-lszg5\" (UID: \"1161b36d-e8f6-417a-975a-80f7d1eba5e1\") " pod="openshift-dns/dns-default-lszg5"
Apr 22 21:09:35.755338 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.755097 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b521d1ec-ac06-4801-88b2-25972cfd8773-klusterlet-config\") pod \"klusterlet-addon-workmgr-84d9f98565-ggst2\" (UID: \"b521d1ec-ac06-4801-88b2-25972cfd8773\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2"
Apr 22 21:09:35.755338 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.755118 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/e942491f-8eab-4373-99e0-3a821262ac0d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7dbb487bf6-hmzzm\" (UID: \"e942491f-8eab-4373-99e0-3a821262ac0d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm"
Apr 22 21:09:35.755338 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.755164 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b521d1ec-ac06-4801-88b2-25972cfd8773-tmp\") pod \"klusterlet-addon-workmgr-84d9f98565-ggst2\" (UID: \"b521d1ec-ac06-4801-88b2-25972cfd8773\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2"
Apr 22 21:09:35.755338 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.755188 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/e942491f-8eab-4373-99e0-3a821262ac0d-ca\") pod \"cluster-proxy-proxy-agent-7dbb487bf6-hmzzm\" (UID: \"e942491f-8eab-4373-99e0-3a821262ac0d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm"
Apr 22 21:09:35.755338 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.755212 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e942491f-8eab-4373-99e0-3a821262ac0d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7dbb487bf6-hmzzm\" (UID: \"e942491f-8eab-4373-99e0-3a821262ac0d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm"
Apr 22 21:09:35.755338 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.755249 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mfqb\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-kube-api-access-4mfqb\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8"
Apr 22 21:09:35.755338 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.755293 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/381b646f-ad72-498b-b7ba-1b4d21500e65-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f\" (UID: \"381b646f-ad72-498b-b7ba-1b4d21500e65\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f"
Apr 22 21:09:35.755338 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.755316 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/e942491f-8eab-4373-99e0-3a821262ac0d-hub\") pod \"cluster-proxy-proxy-agent-7dbb487bf6-hmzzm\" (UID: \"e942491f-8eab-4373-99e0-3a821262ac0d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm"
Apr 22 21:09:35.760081 ip-10-0-134-137
kubenswrapper[2566]: I0422 21:09:35.759851 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/381b646f-ad72-498b-b7ba-1b4d21500e65-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f\" (UID: \"381b646f-ad72-498b-b7ba-1b4d21500e65\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f" Apr 22 21:09:35.762424 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.762387 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrdrq\" (UniqueName: \"kubernetes.io/projected/381b646f-ad72-498b-b7ba-1b4d21500e65-kube-api-access-xrdrq\") pod \"managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f\" (UID: \"381b646f-ad72-498b-b7ba-1b4d21500e65\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f" Apr 22 21:09:35.856618 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.856579 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/edd98168-b1af-4fed-89c3-c589b37e1931-installation-pull-secrets\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8" Apr 22 21:09:35.856804 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.856627 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4hw7\" (UniqueName: \"kubernetes.io/projected/18f0e0a7-a91f-432d-bcb0-5e33fb885077-kube-api-access-x4hw7\") pod \"ingress-canary-wzzzt\" (UID: \"18f0e0a7-a91f-432d-bcb0-5e33fb885077\") " pod="openshift-ingress-canary/ingress-canary-wzzzt" Apr 22 21:09:35.856804 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.856656 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/1161b36d-e8f6-417a-975a-80f7d1eba5e1-config-volume\") pod \"dns-default-lszg5\" (UID: \"1161b36d-e8f6-417a-975a-80f7d1eba5e1\") " pod="openshift-dns/dns-default-lszg5" Apr 22 21:09:35.856804 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.856680 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/b521d1ec-ac06-4801-88b2-25972cfd8773-klusterlet-config\") pod \"klusterlet-addon-workmgr-84d9f98565-ggst2\" (UID: \"b521d1ec-ac06-4801-88b2-25972cfd8773\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2" Apr 22 21:09:35.856804 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.856696 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/e942491f-8eab-4373-99e0-3a821262ac0d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7dbb487bf6-hmzzm\" (UID: \"e942491f-8eab-4373-99e0-3a821262ac0d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" Apr 22 21:09:35.856804 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.856725 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b521d1ec-ac06-4801-88b2-25972cfd8773-tmp\") pod \"klusterlet-addon-workmgr-84d9f98565-ggst2\" (UID: \"b521d1ec-ac06-4801-88b2-25972cfd8773\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2" Apr 22 21:09:35.856804 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.856748 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/e942491f-8eab-4373-99e0-3a821262ac0d-ca\") pod \"cluster-proxy-proxy-agent-7dbb487bf6-hmzzm\" (UID: \"e942491f-8eab-4373-99e0-3a821262ac0d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" Apr 22 
21:09:35.856804 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.856773 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e942491f-8eab-4373-99e0-3a821262ac0d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7dbb487bf6-hmzzm\" (UID: \"e942491f-8eab-4373-99e0-3a821262ac0d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" Apr 22 21:09:35.856804 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.856801 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mfqb\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-kube-api-access-4mfqb\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8" Apr 22 21:09:35.857182 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.856832 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/e942491f-8eab-4373-99e0-3a821262ac0d-hub\") pod \"cluster-proxy-proxy-agent-7dbb487bf6-hmzzm\" (UID: \"e942491f-8eab-4373-99e0-3a821262ac0d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" Apr 22 21:09:35.857182 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.856887 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8" Apr 22 21:09:35.857182 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.856912 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: 
\"kubernetes.io/secret/e942491f-8eab-4373-99e0-3a821262ac0d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7dbb487bf6-hmzzm\" (UID: \"e942491f-8eab-4373-99e0-3a821262ac0d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" Apr 22 21:09:35.857182 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.856944 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djsfm\" (UniqueName: \"kubernetes.io/projected/e942491f-8eab-4373-99e0-3a821262ac0d-kube-api-access-djsfm\") pod \"cluster-proxy-proxy-agent-7dbb487bf6-hmzzm\" (UID: \"e942491f-8eab-4373-99e0-3a821262ac0d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" Apr 22 21:09:35.857182 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.856991 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkg9b\" (UniqueName: \"kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b\") pod \"network-check-target-b4vrr\" (UID: \"f1c2f55b-0450-43f7-b50e-57b4c3d15108\") " pod="openshift-network-diagnostics/network-check-target-b4vrr" Apr 22 21:09:35.857182 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.857023 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/edd98168-b1af-4fed-89c3-c589b37e1931-trusted-ca\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8" Apr 22 21:09:35.857182 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.857048 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1161b36d-e8f6-417a-975a-80f7d1eba5e1-tmp-dir\") pod \"dns-default-lszg5\" (UID: \"1161b36d-e8f6-417a-975a-80f7d1eba5e1\") " pod="openshift-dns/dns-default-lszg5" Apr 22 
21:09:35.857182 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.857079 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/edd98168-b1af-4fed-89c3-c589b37e1931-ca-trust-extracted\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8" Apr 22 21:09:35.857182 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.857102 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-bound-sa-token\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8" Apr 22 21:09:35.857182 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.857128 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls\") pod \"dns-default-lszg5\" (UID: \"1161b36d-e8f6-417a-975a-80f7d1eba5e1\") " pod="openshift-dns/dns-default-lszg5" Apr 22 21:09:35.857182 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.857181 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/edd98168-b1af-4fed-89c3-c589b37e1931-image-registry-private-configuration\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8" Apr 22 21:09:35.857730 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.857207 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/edd98168-b1af-4fed-89c3-c589b37e1931-registry-certificates\") 
pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8" Apr 22 21:09:35.857730 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.857243 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28h9r\" (UniqueName: \"kubernetes.io/projected/b521d1ec-ac06-4801-88b2-25972cfd8773-kube-api-access-28h9r\") pod \"klusterlet-addon-workmgr-84d9f98565-ggst2\" (UID: \"b521d1ec-ac06-4801-88b2-25972cfd8773\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2" Apr 22 21:09:35.857730 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.857272 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert\") pod \"ingress-canary-wzzzt\" (UID: \"18f0e0a7-a91f-432d-bcb0-5e33fb885077\") " pod="openshift-ingress-canary/ingress-canary-wzzzt" Apr 22 21:09:35.857730 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.857295 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xg2tb\" (UniqueName: \"kubernetes.io/projected/1161b36d-e8f6-417a-975a-80f7d1eba5e1-kube-api-access-xg2tb\") pod \"dns-default-lszg5\" (UID: \"1161b36d-e8f6-417a-975a-80f7d1eba5e1\") " pod="openshift-dns/dns-default-lszg5" Apr 22 21:09:35.857730 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.857316 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1161b36d-e8f6-417a-975a-80f7d1eba5e1-config-volume\") pod \"dns-default-lszg5\" (UID: \"1161b36d-e8f6-417a-975a-80f7d1eba5e1\") " pod="openshift-dns/dns-default-lszg5" Apr 22 21:09:35.857730 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.857379 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/b521d1ec-ac06-4801-88b2-25972cfd8773-tmp\") pod \"klusterlet-addon-workmgr-84d9f98565-ggst2\" (UID: \"b521d1ec-ac06-4801-88b2-25972cfd8773\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2" Apr 22 21:09:35.858108 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.858083 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/e942491f-8eab-4373-99e0-3a821262ac0d-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7dbb487bf6-hmzzm\" (UID: \"e942491f-8eab-4373-99e0-3a821262ac0d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" Apr 22 21:09:35.858244 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.858219 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/edd98168-b1af-4fed-89c3-c589b37e1931-registry-certificates\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8" Apr 22 21:09:35.858487 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:35.858465 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 21:09:35.858487 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:35.858488 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-79894b4455-t8gz8: secret "image-registry-tls" not found Apr 22 21:09:35.858689 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:35.858559 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls podName:edd98168-b1af-4fed-89c3-c589b37e1931 nodeName:}" failed. 
No retries permitted until 2026-04-22 21:09:36.358525342 +0000 UTC m=+33.715340072 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls") pod "image-registry-79894b4455-t8gz8" (UID: "edd98168-b1af-4fed-89c3-c589b37e1931") : secret "image-registry-tls" not found Apr 22 21:09:35.858689 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:35.858590 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 21:09:35.858689 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:35.858677 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert podName:18f0e0a7-a91f-432d-bcb0-5e33fb885077 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:36.358658081 +0000 UTC m=+33.715472827 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert") pod "ingress-canary-wzzzt" (UID: "18f0e0a7-a91f-432d-bcb0-5e33fb885077") : secret "canary-serving-cert" not found Apr 22 21:09:35.858967 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.858942 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1161b36d-e8f6-417a-975a-80f7d1eba5e1-tmp-dir\") pod \"dns-default-lszg5\" (UID: \"1161b36d-e8f6-417a-975a-80f7d1eba5e1\") " pod="openshift-dns/dns-default-lszg5" Apr 22 21:09:35.859069 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:35.859052 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 21:09:35.859203 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:35.859075 2566 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 21:09:35.859203 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:35.859086 2566 projected.go:194] Error preparing data for projected volume kube-api-access-jkg9b for pod openshift-network-diagnostics/network-check-target-b4vrr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:35.859203 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:35.859126 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b podName:f1c2f55b-0450-43f7-b50e-57b4c3d15108 nodeName:}" failed. No retries permitted until 2026-04-22 21:10:07.85911034 +0000 UTC m=+65.215925076 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-jkg9b" (UniqueName: "kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b") pod "network-check-target-b4vrr" (UID: "f1c2f55b-0450-43f7-b50e-57b4c3d15108") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 21:09:35.860778 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.860178 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/e942491f-8eab-4373-99e0-3a821262ac0d-hub\") pod \"cluster-proxy-proxy-agent-7dbb487bf6-hmzzm\" (UID: \"e942491f-8eab-4373-99e0-3a821262ac0d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" Apr 22 21:09:35.860778 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.860466 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: 
\"kubernetes.io/secret/b521d1ec-ac06-4801-88b2-25972cfd8773-klusterlet-config\") pod \"klusterlet-addon-workmgr-84d9f98565-ggst2\" (UID: \"b521d1ec-ac06-4801-88b2-25972cfd8773\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2" Apr 22 21:09:35.860778 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:35.860561 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 21:09:35.860778 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.860600 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/edd98168-b1af-4fed-89c3-c589b37e1931-trusted-ca\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8" Apr 22 21:09:35.860778 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:35.860615 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls podName:1161b36d-e8f6-417a-975a-80f7d1eba5e1 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:36.36059942 +0000 UTC m=+33.717414159 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls") pod "dns-default-lszg5" (UID: "1161b36d-e8f6-417a-975a-80f7d1eba5e1") : secret "dns-default-metrics-tls" not found Apr 22 21:09:35.860778 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.860742 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/edd98168-b1af-4fed-89c3-c589b37e1931-ca-trust-extracted\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8" Apr 22 21:09:35.861125 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.861090 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/e942491f-8eab-4373-99e0-3a821262ac0d-ca\") pod \"cluster-proxy-proxy-agent-7dbb487bf6-hmzzm\" (UID: \"e942491f-8eab-4373-99e0-3a821262ac0d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" Apr 22 21:09:35.861335 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.861315 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/edd98168-b1af-4fed-89c3-c589b37e1931-image-registry-private-configuration\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8" Apr 22 21:09:35.861429 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.861392 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/edd98168-b1af-4fed-89c3-c589b37e1931-installation-pull-secrets\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " 
pod="openshift-image-registry/image-registry-79894b4455-t8gz8" Apr 22 21:09:35.862203 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.862182 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e942491f-8eab-4373-99e0-3a821262ac0d-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7dbb487bf6-hmzzm\" (UID: \"e942491f-8eab-4373-99e0-3a821262ac0d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" Apr 22 21:09:35.862289 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.862252 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/e942491f-8eab-4373-99e0-3a821262ac0d-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7dbb487bf6-hmzzm\" (UID: \"e942491f-8eab-4373-99e0-3a821262ac0d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" Apr 22 21:09:35.867483 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.867460 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4hw7\" (UniqueName: \"kubernetes.io/projected/18f0e0a7-a91f-432d-bcb0-5e33fb885077-kube-api-access-x4hw7\") pod \"ingress-canary-wzzzt\" (UID: \"18f0e0a7-a91f-432d-bcb0-5e33fb885077\") " pod="openshift-ingress-canary/ingress-canary-wzzzt" Apr 22 21:09:35.867618 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.867459 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg2tb\" (UniqueName: \"kubernetes.io/projected/1161b36d-e8f6-417a-975a-80f7d1eba5e1-kube-api-access-xg2tb\") pod \"dns-default-lszg5\" (UID: \"1161b36d-e8f6-417a-975a-80f7d1eba5e1\") " pod="openshift-dns/dns-default-lszg5" Apr 22 21:09:35.869081 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.869029 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28h9r\" (UniqueName: 
\"kubernetes.io/projected/b521d1ec-ac06-4801-88b2-25972cfd8773-kube-api-access-28h9r\") pod \"klusterlet-addon-workmgr-84d9f98565-ggst2\" (UID: \"b521d1ec-ac06-4801-88b2-25972cfd8773\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2" Apr 22 21:09:35.869565 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.869529 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djsfm\" (UniqueName: \"kubernetes.io/projected/e942491f-8eab-4373-99e0-3a821262ac0d-kube-api-access-djsfm\") pod \"cluster-proxy-proxy-agent-7dbb487bf6-hmzzm\" (UID: \"e942491f-8eab-4373-99e0-3a821262ac0d\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" Apr 22 21:09:35.874503 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.874479 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-bound-sa-token\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8" Apr 22 21:09:35.874607 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.874581 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mfqb\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-kube-api-access-4mfqb\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8" Apr 22 21:09:35.888069 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.888048 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f" Apr 22 21:09:35.897818 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.897796 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2" Apr 22 21:09:35.931748 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:35.931719 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" Apr 22 21:09:36.179044 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:36.179016 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2"] Apr 22 21:09:36.181653 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:36.181624 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f"] Apr 22 21:09:36.188014 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:36.187992 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm"] Apr 22 21:09:36.307107 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:36.307010 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb521d1ec_ac06_4801_88b2_25972cfd8773.slice/crio-ab02e68850c081cb16afb1083fdcd5ea1c48f7df7323e8ab3b970303e7d3416d WatchSource:0}: Error finding container ab02e68850c081cb16afb1083fdcd5ea1c48f7df7323e8ab3b970303e7d3416d: Status 404 returned error can't find the container with id ab02e68850c081cb16afb1083fdcd5ea1c48f7df7323e8ab3b970303e7d3416d Apr 22 21:09:36.307605 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:36.307483 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod381b646f_ad72_498b_b7ba_1b4d21500e65.slice/crio-cf9357b4cc806070f995bb2dad3ba91052d23e013303fed1b1b52c968f6b3e75 WatchSource:0}: Error finding container 
cf9357b4cc806070f995bb2dad3ba91052d23e013303fed1b1b52c968f6b3e75: Status 404 returned error can't find the container with id cf9357b4cc806070f995bb2dad3ba91052d23e013303fed1b1b52c968f6b3e75
Apr 22 21:09:36.308080 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:36.308059 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode942491f_8eab_4373_99e0_3a821262ac0d.slice/crio-ae15908fee72e2d0aec66c64f9633caafdebb5b5664ea00b7623aae23764bd55 WatchSource:0}: Error finding container ae15908fee72e2d0aec66c64f9633caafdebb5b5664ea00b7623aae23764bd55: Status 404 returned error can't find the container with id ae15908fee72e2d0aec66c64f9633caafdebb5b5664ea00b7623aae23764bd55
Apr 22 21:09:36.363626 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:36.363602 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8"
Apr 22 21:09:36.363711 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:36.363648 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls\") pod \"dns-default-lszg5\" (UID: \"1161b36d-e8f6-417a-975a-80f7d1eba5e1\") " pod="openshift-dns/dns-default-lszg5"
Apr 22 21:09:36.363711 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:36.363683 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert\") pod \"ingress-canary-wzzzt\" (UID: \"18f0e0a7-a91f-432d-bcb0-5e33fb885077\") " pod="openshift-ingress-canary/ingress-canary-wzzzt"
Apr 22 21:09:36.363781 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:36.363747 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 21:09:36.363781 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:36.363751 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 21:09:36.363781 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:36.363774 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-79894b4455-t8gz8: secret "image-registry-tls" not found
Apr 22 21:09:36.363866 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:36.363784 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls podName:1161b36d-e8f6-417a-975a-80f7d1eba5e1 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:37.363772277 +0000 UTC m=+34.720587007 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls") pod "dns-default-lszg5" (UID: "1161b36d-e8f6-417a-975a-80f7d1eba5e1") : secret "dns-default-metrics-tls" not found
Apr 22 21:09:36.363866 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:36.363818 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls podName:edd98168-b1af-4fed-89c3-c589b37e1931 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:37.363803331 +0000 UTC m=+34.720618066 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls") pod "image-registry-79894b4455-t8gz8" (UID: "edd98168-b1af-4fed-89c3-c589b37e1931") : secret "image-registry-tls" not found
Apr 22 21:09:36.363866 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:36.363749 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 21:09:36.363866 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:36.363852 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert podName:18f0e0a7-a91f-432d-bcb0-5e33fb885077 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:37.36384324 +0000 UTC m=+34.720657974 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert") pod "ingress-canary-wzzzt" (UID: "18f0e0a7-a91f-432d-bcb0-5e33fb885077") : secret "canary-serving-cert" not found
Apr 22 21:09:36.435278 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:36.435237 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f" event={"ID":"381b646f-ad72-498b-b7ba-1b4d21500e65","Type":"ContainerStarted","Data":"cf9357b4cc806070f995bb2dad3ba91052d23e013303fed1b1b52c968f6b3e75"}
Apr 22 21:09:36.436556 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:36.436525 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" event={"ID":"e942491f-8eab-4373-99e0-3a821262ac0d","Type":"ContainerStarted","Data":"ae15908fee72e2d0aec66c64f9633caafdebb5b5664ea00b7623aae23764bd55"}
Apr 22 21:09:36.437499 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:36.437479 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2" event={"ID":"b521d1ec-ac06-4801-88b2-25972cfd8773","Type":"ContainerStarted","Data":"ab02e68850c081cb16afb1083fdcd5ea1c48f7df7323e8ab3b970303e7d3416d"}
Apr 22 21:09:37.128245 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:37.128213 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:37.128882 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:37.128858 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:09:37.129439 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:37.129403 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4vrr"
Apr 22 21:09:37.130312 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:37.130289 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 21:09:37.130878 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:37.130857 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 21:09:37.131123 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:37.131102 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bw4nx\""
Apr 22 21:09:37.131317 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:37.131281 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 21:09:37.131586 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:37.131561 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 21:09:37.131855 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:37.131834 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gqtcf\""
Apr 22 21:09:37.372738 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:37.372699 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8"
Apr 22 21:09:37.373170 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:37.372790 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls\") pod \"dns-default-lszg5\" (UID: \"1161b36d-e8f6-417a-975a-80f7d1eba5e1\") " pod="openshift-dns/dns-default-lszg5"
Apr 22 21:09:37.373170 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:37.372839 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert\") pod \"ingress-canary-wzzzt\" (UID: \"18f0e0a7-a91f-432d-bcb0-5e33fb885077\") " pod="openshift-ingress-canary/ingress-canary-wzzzt"
Apr 22 21:09:37.373170 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:37.372981 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 21:09:37.373170 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:37.373137 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 21:09:37.373170 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:37.373150 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-79894b4455-t8gz8: secret "image-registry-tls" not found
Apr 22 21:09:37.373529 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:37.373228 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 21:09:37.373529 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:37.373315 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert podName:18f0e0a7-a91f-432d-bcb0-5e33fb885077 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:39.373020342 +0000 UTC m=+36.729835077 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert") pod "ingress-canary-wzzzt" (UID: "18f0e0a7-a91f-432d-bcb0-5e33fb885077") : secret "canary-serving-cert" not found
Apr 22 21:09:37.373529 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:37.373349 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls podName:edd98168-b1af-4fed-89c3-c589b37e1931 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:39.373333787 +0000 UTC m=+36.730148518 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls") pod "image-registry-79894b4455-t8gz8" (UID: "edd98168-b1af-4fed-89c3-c589b37e1931") : secret "image-registry-tls" not found
Apr 22 21:09:37.373529 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:37.373363 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls podName:1161b36d-e8f6-417a-975a-80f7d1eba5e1 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:39.373356481 +0000 UTC m=+36.730171211 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls") pod "dns-default-lszg5" (UID: "1161b36d-e8f6-417a-975a-80f7d1eba5e1") : secret "dns-default-metrics-tls" not found
Apr 22 21:09:37.454175 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:37.454098 2566 generic.go:358] "Generic (PLEG): container finished" podID="d86eb1d3-4e06-4952-905d-3bc13ae4849b" containerID="380ac232baafa32fb70d75edb766277eedb82ffcce754babd1787784f0e4651a" exitCode=0
Apr 22 21:09:37.454175 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:37.454165 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w54p8" event={"ID":"d86eb1d3-4e06-4952-905d-3bc13ae4849b","Type":"ContainerDied","Data":"380ac232baafa32fb70d75edb766277eedb82ffcce754babd1787784f0e4651a"}
Apr 22 21:09:38.462427 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:38.462179 2566 generic.go:358] "Generic (PLEG): container finished" podID="d86eb1d3-4e06-4952-905d-3bc13ae4849b" containerID="69c495ca64db4325524545fa7d1a550c54814ec494939694d1e2338191af0d5d" exitCode=0
Apr 22 21:09:38.462869 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:38.462458 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w54p8" event={"ID":"d86eb1d3-4e06-4952-905d-3bc13ae4849b","Type":"ContainerDied","Data":"69c495ca64db4325524545fa7d1a550c54814ec494939694d1e2338191af0d5d"}
Apr 22 21:09:39.393354 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:39.393315 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert\") pod \"ingress-canary-wzzzt\" (UID: \"18f0e0a7-a91f-432d-bcb0-5e33fb885077\") " pod="openshift-ingress-canary/ingress-canary-wzzzt"
Apr 22 21:09:39.393541 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:39.393423 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8"
Apr 22 21:09:39.393541 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:39.393446 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 21:09:39.393541 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:39.393484 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls\") pod \"dns-default-lszg5\" (UID: \"1161b36d-e8f6-417a-975a-80f7d1eba5e1\") " pod="openshift-dns/dns-default-lszg5"
Apr 22 21:09:39.393541 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:39.393506 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert podName:18f0e0a7-a91f-432d-bcb0-5e33fb885077 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:43.393491281 +0000 UTC m=+40.750306010 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert") pod "ingress-canary-wzzzt" (UID: "18f0e0a7-a91f-432d-bcb0-5e33fb885077") : secret "canary-serving-cert" not found
Apr 22 21:09:39.393765 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:39.393570 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 21:09:39.393765 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:39.393618 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls podName:1161b36d-e8f6-417a-975a-80f7d1eba5e1 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:43.393603072 +0000 UTC m=+40.750417817 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls") pod "dns-default-lszg5" (UID: "1161b36d-e8f6-417a-975a-80f7d1eba5e1") : secret "dns-default-metrics-tls" not found
Apr 22 21:09:39.393765 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:39.393683 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 21:09:39.393765 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:39.393695 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-79894b4455-t8gz8: secret "image-registry-tls" not found
Apr 22 21:09:39.393765 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:39.393726 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls podName:edd98168-b1af-4fed-89c3-c589b37e1931 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:43.393716131 +0000 UTC m=+40.750530864 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls") pod "image-registry-79894b4455-t8gz8" (UID: "edd98168-b1af-4fed-89c3-c589b37e1931") : secret "image-registry-tls" not found
Apr 22 21:09:42.472867 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:42.472784 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f" event={"ID":"381b646f-ad72-498b-b7ba-1b4d21500e65","Type":"ContainerStarted","Data":"b5ca2607c1deee2f723757a937bf3b8761970474126152b61838eab804309038"}
Apr 22 21:09:42.474186 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:42.474163 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" event={"ID":"e942491f-8eab-4373-99e0-3a821262ac0d","Type":"ContainerStarted","Data":"4ae143275d941421181e980914d374af49f3cd224ea93719b23913bfd500dad0"}
Apr 22 21:09:42.477176 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:42.477152 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-w54p8" event={"ID":"d86eb1d3-4e06-4952-905d-3bc13ae4849b","Type":"ContainerStarted","Data":"e211e584c38349b697ce24e02b86b407d087dc8c39ae7499de9840c783c84099"}
Apr 22 21:09:42.478367 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:42.478349 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2" event={"ID":"b521d1ec-ac06-4801-88b2-25972cfd8773","Type":"ContainerStarted","Data":"2168e5fed3e614b2c469de58a237eabc4ac1dc32b2650577f1792fa8ce6db475"}
Apr 22 21:09:42.478584 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:42.478572 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2"
Apr 22 21:09:42.480066 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:42.480046 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2"
Apr 22 21:09:42.488257 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:42.488221 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f" podStartSLOduration=2.603895458 podStartE2EDuration="8.48820998s" podCreationTimestamp="2026-04-22 21:09:34 +0000 UTC" firstStartedPulling="2026-04-22 21:09:36.324500095 +0000 UTC m=+33.681314844" lastFinishedPulling="2026-04-22 21:09:42.208814629 +0000 UTC m=+39.565629366" observedRunningTime="2026-04-22 21:09:42.487642 +0000 UTC m=+39.844456752" watchObservedRunningTime="2026-04-22 21:09:42.48820998 +0000 UTC m=+39.845024729"
Apr 22 21:09:42.506618 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:42.506473 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-w54p8" podStartSLOduration=7.542710056 podStartE2EDuration="39.506458547s" podCreationTimestamp="2026-04-22 21:09:03 +0000 UTC" firstStartedPulling="2026-04-22 21:09:04.384502887 +0000 UTC m=+1.741317620" lastFinishedPulling="2026-04-22 21:09:36.348251378 +0000 UTC m=+33.705066111" observedRunningTime="2026-04-22 21:09:42.505401952 +0000 UTC m=+39.862216701" watchObservedRunningTime="2026-04-22 21:09:42.506458547 +0000 UTC m=+39.863273299"
Apr 22 21:09:42.519709 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:42.519669 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2" podStartSLOduration=2.613288366 podStartE2EDuration="8.519654605s" podCreationTimestamp="2026-04-22 21:09:34 +0000 UTC" firstStartedPulling="2026-04-22 21:09:36.324429744 +0000 UTC m=+33.681244491" lastFinishedPulling="2026-04-22 21:09:42.230795985 +0000 UTC m=+39.587610730" observedRunningTime="2026-04-22 21:09:42.519047874 +0000 UTC m=+39.875862627" watchObservedRunningTime="2026-04-22 21:09:42.519654605 +0000 UTC m=+39.876469354"
Apr 22 21:09:43.425450 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:43.425395 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls\") pod \"dns-default-lszg5\" (UID: \"1161b36d-e8f6-417a-975a-80f7d1eba5e1\") " pod="openshift-dns/dns-default-lszg5"
Apr 22 21:09:43.425645 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:43.425485 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert\") pod \"ingress-canary-wzzzt\" (UID: \"18f0e0a7-a91f-432d-bcb0-5e33fb885077\") " pod="openshift-ingress-canary/ingress-canary-wzzzt"
Apr 22 21:09:43.425645 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:43.425533 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8"
Apr 22 21:09:43.425645 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:43.425567 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 21:09:43.425645 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:43.425620 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 21:09:43.425645 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:43.425634 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-79894b4455-t8gz8: secret "image-registry-tls" not found
Apr 22 21:09:43.425885 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:43.425651 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls podName:1161b36d-e8f6-417a-975a-80f7d1eba5e1 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:51.425627625 +0000 UTC m=+48.782442359 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls") pod "dns-default-lszg5" (UID: "1161b36d-e8f6-417a-975a-80f7d1eba5e1") : secret "dns-default-metrics-tls" not found
Apr 22 21:09:43.425885 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:43.425689 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 21:09:43.425885 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:43.425694 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls podName:edd98168-b1af-4fed-89c3-c589b37e1931 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:51.425679036 +0000 UTC m=+48.782493766 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls") pod "image-registry-79894b4455-t8gz8" (UID: "edd98168-b1af-4fed-89c3-c589b37e1931") : secret "image-registry-tls" not found
Apr 22 21:09:43.425885 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:43.425761 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert podName:18f0e0a7-a91f-432d-bcb0-5e33fb885077 nodeName:}" failed. No retries permitted until 2026-04-22 21:09:51.425745354 +0000 UTC m=+48.782560087 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert") pod "ingress-canary-wzzzt" (UID: "18f0e0a7-a91f-432d-bcb0-5e33fb885077") : secret "canary-serving-cert" not found
Apr 22 21:09:45.485465 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:45.485431 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" event={"ID":"e942491f-8eab-4373-99e0-3a821262ac0d","Type":"ContainerStarted","Data":"cb21c23896ce4a461917db3d79e2a371b5b23ea4da996d49393cecfdc782dabc"}
Apr 22 21:09:45.485465 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:45.485467 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" event={"ID":"e942491f-8eab-4373-99e0-3a821262ac0d","Type":"ContainerStarted","Data":"ae5dedd5136b657e8e1d9b06ad655c4ddd2adebe34c2cd9be07b5584d06c554c"}
Apr 22 21:09:45.502182 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:45.502137 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" podStartSLOduration=2.801824154 podStartE2EDuration="11.502125562s" podCreationTimestamp="2026-04-22 21:09:34 +0000 UTC" firstStartedPulling="2026-04-22 21:09:36.324664407 +0000 UTC m=+33.681479137" lastFinishedPulling="2026-04-22 21:09:45.024965799 +0000 UTC m=+42.381780545" observedRunningTime="2026-04-22 21:09:45.500965424 +0000 UTC m=+42.857780188" watchObservedRunningTime="2026-04-22 21:09:45.502125562 +0000 UTC m=+42.858940313"
Apr 22 21:09:45.843502 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:45.843463 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/94c92ddb-384e-4003-80b5-1b032afdc994-original-pull-secret\") pod \"global-pull-secret-syncer-gsqgw\" (UID: \"94c92ddb-384e-4003-80b5-1b032afdc994\") " pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:45.846892 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:45.846872 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/94c92ddb-384e-4003-80b5-1b032afdc994-original-pull-secret\") pod \"global-pull-secret-syncer-gsqgw\" (UID: \"94c92ddb-384e-4003-80b5-1b032afdc994\") " pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:45.847777 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:45.847758 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsqgw"
Apr 22 21:09:45.954981 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:45.954950 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gsqgw"]
Apr 22 21:09:45.958157 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:09:45.958134 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94c92ddb_384e_4003_80b5_1b032afdc994.slice/crio-5cc5745ef634bb0498d91f1d3eaada3d6cfaf83c91ddafcd0f5f54fdb5bc5044 WatchSource:0}: Error finding container 5cc5745ef634bb0498d91f1d3eaada3d6cfaf83c91ddafcd0f5f54fdb5bc5044: Status 404 returned error can't find the container with id 5cc5745ef634bb0498d91f1d3eaada3d6cfaf83c91ddafcd0f5f54fdb5bc5044
Apr 22 21:09:46.489707 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:46.489671 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gsqgw" event={"ID":"94c92ddb-384e-4003-80b5-1b032afdc994","Type":"ContainerStarted","Data":"5cc5745ef634bb0498d91f1d3eaada3d6cfaf83c91ddafcd0f5f54fdb5bc5044"}
Apr 22 21:09:50.504074 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:50.503983 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gsqgw" event={"ID":"94c92ddb-384e-4003-80b5-1b032afdc994","Type":"ContainerStarted","Data":"59594d5868c59b74ae7cb5d09bee2922d41d3bce6725b592bc0ebec591548cbc"}
Apr 22 21:09:51.488762 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:51.488720 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8"
Apr 22 21:09:51.489041 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:51.488782 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls\") pod \"dns-default-lszg5\" (UID: \"1161b36d-e8f6-417a-975a-80f7d1eba5e1\") " pod="openshift-dns/dns-default-lszg5"
Apr 22 21:09:51.489041 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:51.488866 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 21:09:51.489041 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:51.488883 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 21:09:51.489041 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:51.488906 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-79894b4455-t8gz8: secret "image-registry-tls" not found
Apr 22 21:09:51.489041 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:51.488924 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls podName:1161b36d-e8f6-417a-975a-80f7d1eba5e1 nodeName:}" failed. No retries permitted until 2026-04-22 21:10:07.488909253 +0000 UTC m=+64.845723987 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls") pod "dns-default-lszg5" (UID: "1161b36d-e8f6-417a-975a-80f7d1eba5e1") : secret "dns-default-metrics-tls" not found
Apr 22 21:09:51.489041 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:09:51.488921 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert\") pod \"ingress-canary-wzzzt\" (UID: \"18f0e0a7-a91f-432d-bcb0-5e33fb885077\") " pod="openshift-ingress-canary/ingress-canary-wzzzt"
Apr 22 21:09:51.489041 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:51.488975 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls podName:edd98168-b1af-4fed-89c3-c589b37e1931 nodeName:}" failed. No retries permitted until 2026-04-22 21:10:07.488959772 +0000 UTC m=+64.845774509 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls") pod "image-registry-79894b4455-t8gz8" (UID: "edd98168-b1af-4fed-89c3-c589b37e1931") : secret "image-registry-tls" not found
Apr 22 21:09:51.489041 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:51.489019 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 21:09:51.489318 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:09:51.489067 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert podName:18f0e0a7-a91f-432d-bcb0-5e33fb885077 nodeName:}" failed. No retries permitted until 2026-04-22 21:10:07.489056618 +0000 UTC m=+64.845871347 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert") pod "ingress-canary-wzzzt" (UID: "18f0e0a7-a91f-432d-bcb0-5e33fb885077") : secret "canary-serving-cert" not found
Apr 22 21:10:01.436220 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:01.436190 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wqvs8"
Apr 22 21:10:01.461729 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:01.461666 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-gsqgw" podStartSLOduration=44.248525296 podStartE2EDuration="48.461651651s" podCreationTimestamp="2026-04-22 21:09:13 +0000 UTC" firstStartedPulling="2026-04-22 21:09:45.959783804 +0000 UTC m=+43.316598537" lastFinishedPulling="2026-04-22 21:09:50.172910158 +0000 UTC m=+47.529724892" observedRunningTime="2026-04-22 21:09:50.515865138 +0000 UTC m=+47.872679893" watchObservedRunningTime="2026-04-22 21:10:01.461651651 +0000 UTC m=+58.818466404"
Apr 22 21:10:07.514225 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:07.514185 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8"
Apr 22 21:10:07.514631 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:07.514255 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls\") pod \"dns-default-lszg5\" (UID: \"1161b36d-e8f6-417a-975a-80f7d1eba5e1\") " pod="openshift-dns/dns-default-lszg5"
Apr 22 21:10:07.514631 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:07.514298 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert\") pod \"ingress-canary-wzzzt\" (UID: \"18f0e0a7-a91f-432d-bcb0-5e33fb885077\") " pod="openshift-ingress-canary/ingress-canary-wzzzt"
Apr 22 21:10:07.514631 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:10:07.514356 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 21:10:07.514631 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:10:07.514374 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-79894b4455-t8gz8: secret "image-registry-tls" not found
Apr 22 21:10:07.514631 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:10:07.514398 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 21:10:07.514631 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:10:07.514402 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 21:10:07.514631 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:10:07.514458 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls podName:edd98168-b1af-4fed-89c3-c589b37e1931 nodeName:}" failed. No retries permitted until 2026-04-22 21:10:39.514438274 +0000 UTC m=+96.871253010 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls") pod "image-registry-79894b4455-t8gz8" (UID: "edd98168-b1af-4fed-89c3-c589b37e1931") : secret "image-registry-tls" not found
Apr 22 21:10:07.514631 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:10:07.514472 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert podName:18f0e0a7-a91f-432d-bcb0-5e33fb885077 nodeName:}" failed. No retries permitted until 2026-04-22 21:10:39.514466317 +0000 UTC m=+96.871281047 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert") pod "ingress-canary-wzzzt" (UID: "18f0e0a7-a91f-432d-bcb0-5e33fb885077") : secret "canary-serving-cert" not found
Apr 22 21:10:07.514631 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:10:07.514481 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls podName:1161b36d-e8f6-417a-975a-80f7d1eba5e1 nodeName:}" failed. No retries permitted until 2026-04-22 21:10:39.514476447 +0000 UTC m=+96.871291177 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls") pod "dns-default-lszg5" (UID: "1161b36d-e8f6-417a-975a-80f7d1eba5e1") : secret "dns-default-metrics-tls" not found Apr 22 21:10:07.816474 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:07.816446 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs\") pod \"network-metrics-daemon-qvqz5\" (UID: \"01c1be80-c7fc-433f-bf11-a97af5540866\") " pod="openshift-multus/network-metrics-daemon-qvqz5" Apr 22 21:10:07.818763 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:07.818736 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 21:10:07.827384 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:10:07.827363 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 21:10:07.827490 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:10:07.827430 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs podName:01c1be80-c7fc-433f-bf11-a97af5540866 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:11.827402737 +0000 UTC m=+129.184217473 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs") pod "network-metrics-daemon-qvqz5" (UID: "01c1be80-c7fc-433f-bf11-a97af5540866") : secret "metrics-daemon-secret" not found Apr 22 21:10:07.917547 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:07.917517 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkg9b\" (UniqueName: \"kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b\") pod \"network-check-target-b4vrr\" (UID: \"f1c2f55b-0450-43f7-b50e-57b4c3d15108\") " pod="openshift-network-diagnostics/network-check-target-b4vrr" Apr 22 21:10:07.919548 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:07.919531 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 21:10:07.929627 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:07.929607 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 21:10:07.941403 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:07.941379 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkg9b\" (UniqueName: \"kubernetes.io/projected/f1c2f55b-0450-43f7-b50e-57b4c3d15108-kube-api-access-jkg9b\") pod \"network-check-target-b4vrr\" (UID: \"f1c2f55b-0450-43f7-b50e-57b4c3d15108\") " pod="openshift-network-diagnostics/network-check-target-b4vrr" Apr 22 21:10:08.066722 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:08.066636 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gqtcf\"" Apr 22 21:10:08.075343 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:08.075319 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b4vrr" Apr 22 21:10:08.182716 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:08.182654 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b4vrr"] Apr 22 21:10:08.186928 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:10:08.186898 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1c2f55b_0450_43f7_b50e_57b4c3d15108.slice/crio-5d6438bbbc3723e9c8f90369bbbcf9375910bcfd74e490cf55b248fdf1573698 WatchSource:0}: Error finding container 5d6438bbbc3723e9c8f90369bbbcf9375910bcfd74e490cf55b248fdf1573698: Status 404 returned error can't find the container with id 5d6438bbbc3723e9c8f90369bbbcf9375910bcfd74e490cf55b248fdf1573698 Apr 22 21:10:08.549604 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:08.549522 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b4vrr" event={"ID":"f1c2f55b-0450-43f7-b50e-57b4c3d15108","Type":"ContainerStarted","Data":"5d6438bbbc3723e9c8f90369bbbcf9375910bcfd74e490cf55b248fdf1573698"} Apr 22 21:10:11.558731 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:11.558693 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b4vrr" event={"ID":"f1c2f55b-0450-43f7-b50e-57b4c3d15108","Type":"ContainerStarted","Data":"982a93a4013ac110306a1765ae8de97c855ac8c0b0f67368a4ea96d791459cea"} Apr 22 21:10:11.559164 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:11.558793 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-b4vrr" Apr 22 21:10:11.571945 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:11.571905 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-b4vrr" 
podStartSLOduration=65.398408762 podStartE2EDuration="1m8.571892573s" podCreationTimestamp="2026-04-22 21:09:03 +0000 UTC" firstStartedPulling="2026-04-22 21:10:08.188961203 +0000 UTC m=+65.545775939" lastFinishedPulling="2026-04-22 21:10:11.362445011 +0000 UTC m=+68.719259750" observedRunningTime="2026-04-22 21:10:11.570838218 +0000 UTC m=+68.927652993" watchObservedRunningTime="2026-04-22 21:10:11.571892573 +0000 UTC m=+68.928707325" Apr 22 21:10:39.566113 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:39.566064 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert\") pod \"ingress-canary-wzzzt\" (UID: \"18f0e0a7-a91f-432d-bcb0-5e33fb885077\") " pod="openshift-ingress-canary/ingress-canary-wzzzt" Apr 22 21:10:39.566643 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:39.566129 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8" Apr 22 21:10:39.566643 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:39.566161 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls\") pod \"dns-default-lszg5\" (UID: \"1161b36d-e8f6-417a-975a-80f7d1eba5e1\") " pod="openshift-dns/dns-default-lszg5" Apr 22 21:10:39.566643 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:10:39.566246 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 21:10:39.566643 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:10:39.566245 2566 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 21:10:39.566643 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:10:39.566267 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 21:10:39.566643 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:10:39.566282 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-79894b4455-t8gz8: secret "image-registry-tls" not found Apr 22 21:10:39.566643 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:10:39.566306 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls podName:1161b36d-e8f6-417a-975a-80f7d1eba5e1 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:43.566293433 +0000 UTC m=+160.923108164 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls") pod "dns-default-lszg5" (UID: "1161b36d-e8f6-417a-975a-80f7d1eba5e1") : secret "dns-default-metrics-tls" not found Apr 22 21:10:39.566643 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:10:39.566319 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert podName:18f0e0a7-a91f-432d-bcb0-5e33fb885077 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:43.566313261 +0000 UTC m=+160.923127990 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert") pod "ingress-canary-wzzzt" (UID: "18f0e0a7-a91f-432d-bcb0-5e33fb885077") : secret "canary-serving-cert" not found Apr 22 21:10:39.566643 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:10:39.566329 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls podName:edd98168-b1af-4fed-89c3-c589b37e1931 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:43.566323802 +0000 UTC m=+160.923138531 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls") pod "image-registry-79894b4455-t8gz8" (UID: "edd98168-b1af-4fed-89c3-c589b37e1931") : secret "image-registry-tls" not found Apr 22 21:10:42.563965 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:10:42.563911 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-b4vrr" Apr 22 21:11:05.132890 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:05.132859 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-t4hx2_83539234-7602-4c9d-a9c8-05dca158b65b/dns-node-resolver/0.log" Apr 22 21:11:05.532787 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:05.532712 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jq6nq_fbd1aca7-dece-43f0-b914-b6993f56f39a/node-ca/0.log" Apr 22 21:11:11.902503 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:11.902445 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs\") pod \"network-metrics-daemon-qvqz5\" (UID: \"01c1be80-c7fc-433f-bf11-a97af5540866\") " 
pod="openshift-multus/network-metrics-daemon-qvqz5" Apr 22 21:11:11.903021 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:11:11.902591 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 21:11:11.903021 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:11:11.902681 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs podName:01c1be80-c7fc-433f-bf11-a97af5540866 nodeName:}" failed. No retries permitted until 2026-04-22 21:13:13.902665185 +0000 UTC m=+251.259479914 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs") pod "network-metrics-daemon-qvqz5" (UID: "01c1be80-c7fc-433f-bf11-a97af5540866") : secret "metrics-daemon-secret" not found Apr 22 21:11:16.357272 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.357232 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-ldx4t"] Apr 22 21:11:16.360270 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.360244 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ldx4t" Apr 22 21:11:16.362155 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.362126 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 21:11:16.362155 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.362158 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 21:11:16.362333 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.362194 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 21:11:16.362333 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.362158 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 21:11:16.362792 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.362779 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rwlrc\"" Apr 22 21:11:16.368786 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.368767 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ldx4t"] Apr 22 21:11:16.439327 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.439290 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6301c666-2041-4fac-a690-1918445b7057-crio-socket\") pod \"insights-runtime-extractor-ldx4t\" (UID: \"6301c666-2041-4fac-a690-1918445b7057\") " pod="openshift-insights/insights-runtime-extractor-ldx4t" Apr 22 21:11:16.439327 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.439330 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-c9xxf\" (UniqueName: \"kubernetes.io/projected/6301c666-2041-4fac-a690-1918445b7057-kube-api-access-c9xxf\") pod \"insights-runtime-extractor-ldx4t\" (UID: \"6301c666-2041-4fac-a690-1918445b7057\") " pod="openshift-insights/insights-runtime-extractor-ldx4t" Apr 22 21:11:16.439576 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.439352 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6301c666-2041-4fac-a690-1918445b7057-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ldx4t\" (UID: \"6301c666-2041-4fac-a690-1918445b7057\") " pod="openshift-insights/insights-runtime-extractor-ldx4t" Apr 22 21:11:16.439576 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.439482 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6301c666-2041-4fac-a690-1918445b7057-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ldx4t\" (UID: \"6301c666-2041-4fac-a690-1918445b7057\") " pod="openshift-insights/insights-runtime-extractor-ldx4t" Apr 22 21:11:16.439576 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.439529 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6301c666-2041-4fac-a690-1918445b7057-data-volume\") pod \"insights-runtime-extractor-ldx4t\" (UID: \"6301c666-2041-4fac-a690-1918445b7057\") " pod="openshift-insights/insights-runtime-extractor-ldx4t" Apr 22 21:11:16.540160 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.540112 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6301c666-2041-4fac-a690-1918445b7057-crio-socket\") pod \"insights-runtime-extractor-ldx4t\" (UID: \"6301c666-2041-4fac-a690-1918445b7057\") " 
pod="openshift-insights/insights-runtime-extractor-ldx4t" Apr 22 21:11:16.540160 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.540163 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9xxf\" (UniqueName: \"kubernetes.io/projected/6301c666-2041-4fac-a690-1918445b7057-kube-api-access-c9xxf\") pod \"insights-runtime-extractor-ldx4t\" (UID: \"6301c666-2041-4fac-a690-1918445b7057\") " pod="openshift-insights/insights-runtime-extractor-ldx4t" Apr 22 21:11:16.540318 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.540190 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6301c666-2041-4fac-a690-1918445b7057-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ldx4t\" (UID: \"6301c666-2041-4fac-a690-1918445b7057\") " pod="openshift-insights/insights-runtime-extractor-ldx4t" Apr 22 21:11:16.540318 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.540228 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6301c666-2041-4fac-a690-1918445b7057-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ldx4t\" (UID: \"6301c666-2041-4fac-a690-1918445b7057\") " pod="openshift-insights/insights-runtime-extractor-ldx4t" Apr 22 21:11:16.540318 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.540245 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6301c666-2041-4fac-a690-1918445b7057-crio-socket\") pod \"insights-runtime-extractor-ldx4t\" (UID: \"6301c666-2041-4fac-a690-1918445b7057\") " pod="openshift-insights/insights-runtime-extractor-ldx4t" Apr 22 21:11:16.540490 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:11:16.540346 2566 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret 
"insights-runtime-extractor-tls" not found Apr 22 21:11:16.540490 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:11:16.540394 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6301c666-2041-4fac-a690-1918445b7057-insights-runtime-extractor-tls podName:6301c666-2041-4fac-a690-1918445b7057 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:17.04038046 +0000 UTC m=+134.397195193 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/6301c666-2041-4fac-a690-1918445b7057-insights-runtime-extractor-tls") pod "insights-runtime-extractor-ldx4t" (UID: "6301c666-2041-4fac-a690-1918445b7057") : secret "insights-runtime-extractor-tls" not found Apr 22 21:11:16.540585 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.540533 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6301c666-2041-4fac-a690-1918445b7057-data-volume\") pod \"insights-runtime-extractor-ldx4t\" (UID: \"6301c666-2041-4fac-a690-1918445b7057\") " pod="openshift-insights/insights-runtime-extractor-ldx4t" Apr 22 21:11:16.541347 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.541327 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6301c666-2041-4fac-a690-1918445b7057-data-volume\") pod \"insights-runtime-extractor-ldx4t\" (UID: \"6301c666-2041-4fac-a690-1918445b7057\") " pod="openshift-insights/insights-runtime-extractor-ldx4t" Apr 22 21:11:16.541387 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.541343 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6301c666-2041-4fac-a690-1918445b7057-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-ldx4t\" (UID: \"6301c666-2041-4fac-a690-1918445b7057\") " 
pod="openshift-insights/insights-runtime-extractor-ldx4t" Apr 22 21:11:16.558564 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:16.558536 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9xxf\" (UniqueName: \"kubernetes.io/projected/6301c666-2041-4fac-a690-1918445b7057-kube-api-access-c9xxf\") pod \"insights-runtime-extractor-ldx4t\" (UID: \"6301c666-2041-4fac-a690-1918445b7057\") " pod="openshift-insights/insights-runtime-extractor-ldx4t" Apr 22 21:11:17.044090 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:17.044051 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6301c666-2041-4fac-a690-1918445b7057-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ldx4t\" (UID: \"6301c666-2041-4fac-a690-1918445b7057\") " pod="openshift-insights/insights-runtime-extractor-ldx4t" Apr 22 21:11:17.044393 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:11:17.044234 2566 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 21:11:17.044393 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:11:17.044337 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6301c666-2041-4fac-a690-1918445b7057-insights-runtime-extractor-tls podName:6301c666-2041-4fac-a690-1918445b7057 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:18.044314554 +0000 UTC m=+135.401129285 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/6301c666-2041-4fac-a690-1918445b7057-insights-runtime-extractor-tls") pod "insights-runtime-extractor-ldx4t" (UID: "6301c666-2041-4fac-a690-1918445b7057") : secret "insights-runtime-extractor-tls" not found Apr 22 21:11:18.051947 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:18.051903 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6301c666-2041-4fac-a690-1918445b7057-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ldx4t\" (UID: \"6301c666-2041-4fac-a690-1918445b7057\") " pod="openshift-insights/insights-runtime-extractor-ldx4t" Apr 22 21:11:18.052374 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:11:18.052056 2566 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 21:11:18.052374 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:11:18.052121 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6301c666-2041-4fac-a690-1918445b7057-insights-runtime-extractor-tls podName:6301c666-2041-4fac-a690-1918445b7057 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:20.052105074 +0000 UTC m=+137.408919820 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/6301c666-2041-4fac-a690-1918445b7057-insights-runtime-extractor-tls") pod "insights-runtime-extractor-ldx4t" (UID: "6301c666-2041-4fac-a690-1918445b7057") : secret "insights-runtime-extractor-tls" not found Apr 22 21:11:20.069155 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:20.069112 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6301c666-2041-4fac-a690-1918445b7057-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ldx4t\" (UID: \"6301c666-2041-4fac-a690-1918445b7057\") " pod="openshift-insights/insights-runtime-extractor-ldx4t" Apr 22 21:11:20.069532 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:11:20.069256 2566 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 21:11:20.069532 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:11:20.069330 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6301c666-2041-4fac-a690-1918445b7057-insights-runtime-extractor-tls podName:6301c666-2041-4fac-a690-1918445b7057 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:24.069314854 +0000 UTC m=+141.426129589 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/6301c666-2041-4fac-a690-1918445b7057-insights-runtime-extractor-tls") pod "insights-runtime-extractor-ldx4t" (UID: "6301c666-2041-4fac-a690-1918445b7057") : secret "insights-runtime-extractor-tls" not found Apr 22 21:11:24.099786 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:24.099746 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6301c666-2041-4fac-a690-1918445b7057-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ldx4t\" (UID: \"6301c666-2041-4fac-a690-1918445b7057\") " pod="openshift-insights/insights-runtime-extractor-ldx4t" Apr 22 21:11:24.102033 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:24.102005 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6301c666-2041-4fac-a690-1918445b7057-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-ldx4t\" (UID: \"6301c666-2041-4fac-a690-1918445b7057\") " pod="openshift-insights/insights-runtime-extractor-ldx4t" Apr 22 21:11:24.169676 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:24.169634 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-ldx4t" Apr 22 21:11:24.286976 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:24.286938 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-ldx4t"] Apr 22 21:11:24.291481 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:11:24.291451 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6301c666_2041_4fac_a690_1918445b7057.slice/crio-372b26afd75dba3ab6ba3cba9f15186a6dd9518e1e2a3acef3e983801fd8faa6 WatchSource:0}: Error finding container 372b26afd75dba3ab6ba3cba9f15186a6dd9518e1e2a3acef3e983801fd8faa6: Status 404 returned error can't find the container with id 372b26afd75dba3ab6ba3cba9f15186a6dd9518e1e2a3acef3e983801fd8faa6 Apr 22 21:11:24.734048 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:24.733957 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ldx4t" event={"ID":"6301c666-2041-4fac-a690-1918445b7057","Type":"ContainerStarted","Data":"836f2177ad2ff44828d89bf363cd038348cfb0c4231082ce03170c496d4288dd"} Apr 22 21:11:24.734048 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:24.733997 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ldx4t" event={"ID":"6301c666-2041-4fac-a690-1918445b7057","Type":"ContainerStarted","Data":"372b26afd75dba3ab6ba3cba9f15186a6dd9518e1e2a3acef3e983801fd8faa6"} Apr 22 21:11:25.737801 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:25.737762 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-ldx4t" event={"ID":"6301c666-2041-4fac-a690-1918445b7057","Type":"ContainerStarted","Data":"f34e327609540449666f56c4cf3cfa1ee7f023781ba4220a982a0067ea3aec14"} Apr 22 21:11:26.745122 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:26.745082 2566 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-insights/insights-runtime-extractor-ldx4t" event={"ID":"6301c666-2041-4fac-a690-1918445b7057","Type":"ContainerStarted","Data":"fdc716e7118a225104bda15b3ac11f1f48ffdc603f2249232d21e946365dad4e"}
Apr 22 21:11:26.760646 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:26.760592 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-ldx4t" podStartSLOduration=8.856501494 podStartE2EDuration="10.760577684s" podCreationTimestamp="2026-04-22 21:11:16 +0000 UTC" firstStartedPulling="2026-04-22 21:11:24.356680623 +0000 UTC m=+141.713495353" lastFinishedPulling="2026-04-22 21:11:26.260756798 +0000 UTC m=+143.617571543" observedRunningTime="2026-04-22 21:11:26.759479062 +0000 UTC m=+144.116293815" watchObservedRunningTime="2026-04-22 21:11:26.760577684 +0000 UTC m=+144.117392435"
Apr 22 21:11:38.612402 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:11:38.612350 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-79894b4455-t8gz8" podUID="edd98168-b1af-4fed-89c3-c589b37e1931"
Apr 22 21:11:38.655655 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:11:38.655611 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-wzzzt" podUID="18f0e0a7-a91f-432d-bcb0-5e33fb885077"
Apr 22 21:11:38.663902 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:11:38.663866 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-lszg5" podUID="1161b36d-e8f6-417a-975a-80f7d1eba5e1"
Apr 22 21:11:38.775617 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:38.775587 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-79894b4455-t8gz8"
Apr 22 21:11:38.775793 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:38.775587 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wzzzt"
Apr 22 21:11:40.157467 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:11:40.157398 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-qvqz5" podUID="01c1be80-c7fc-433f-bf11-a97af5540866"
Apr 22 21:11:42.479168 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:42.479112 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2" podUID="b521d1ec-ac06-4801-88b2-25972cfd8773" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.7:8000/readyz\": dial tcp 10.132.0.7:8000: connect: connection refused"
Apr 22 21:11:42.787423 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:42.787315 2566 generic.go:358] "Generic (PLEG): container finished" podID="b521d1ec-ac06-4801-88b2-25972cfd8773" containerID="2168e5fed3e614b2c469de58a237eabc4ac1dc32b2650577f1792fa8ce6db475" exitCode=1
Apr 22 21:11:42.787590 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:42.787399 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2" event={"ID":"b521d1ec-ac06-4801-88b2-25972cfd8773","Type":"ContainerDied","Data":"2168e5fed3e614b2c469de58a237eabc4ac1dc32b2650577f1792fa8ce6db475"}
Apr 22 21:11:42.787779 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:42.787762 2566 scope.go:117] "RemoveContainer" containerID="2168e5fed3e614b2c469de58a237eabc4ac1dc32b2650577f1792fa8ce6db475"
Apr 22 21:11:42.788685 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:42.788664 2566 generic.go:358] "Generic (PLEG): container finished" podID="381b646f-ad72-498b-b7ba-1b4d21500e65" containerID="b5ca2607c1deee2f723757a937bf3b8761970474126152b61838eab804309038" exitCode=255
Apr 22 21:11:42.788744 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:42.788725 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f" event={"ID":"381b646f-ad72-498b-b7ba-1b4d21500e65","Type":"ContainerDied","Data":"b5ca2607c1deee2f723757a937bf3b8761970474126152b61838eab804309038"}
Apr 22 21:11:42.788987 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:42.788972 2566 scope.go:117] "RemoveContainer" containerID="b5ca2607c1deee2f723757a937bf3b8761970474126152b61838eab804309038"
Apr 22 21:11:42.960164 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:42.960131 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-tp9q7"]
Apr 22 21:11:42.963282 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:42.963257 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:42.967329 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:42.967305 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 21:11:42.967479 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:42.967354 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 21:11:42.967479 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:42.967466 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 21:11:42.967725 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:42.967706 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-qctg5\""
Apr 22 21:11:42.967794 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:42.967743 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 21:11:42.967848 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:42.967821 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 21:11:42.968084 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:42.968061 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 21:11:43.053344 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.053247 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5fa15de5-6550-43dc-afb0-2feb4e44de89-metrics-client-ca\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.053344 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.053301 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w78hl\" (UniqueName: \"kubernetes.io/projected/5fa15de5-6550-43dc-afb0-2feb4e44de89-kube-api-access-w78hl\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.053576 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.053361 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5fa15de5-6550-43dc-afb0-2feb4e44de89-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.053576 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.053398 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5fa15de5-6550-43dc-afb0-2feb4e44de89-node-exporter-accelerators-collector-config\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.053576 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.053504 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5fa15de5-6550-43dc-afb0-2feb4e44de89-sys\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.053576 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.053534 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5fa15de5-6550-43dc-afb0-2feb4e44de89-node-exporter-textfile\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.053576 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.053560 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5fa15de5-6550-43dc-afb0-2feb4e44de89-root\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.053761 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.053590 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5fa15de5-6550-43dc-afb0-2feb4e44de89-node-exporter-wtmp\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.053761 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.053611 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5fa15de5-6550-43dc-afb0-2feb4e44de89-node-exporter-tls\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.153993 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.153956 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5fa15de5-6550-43dc-afb0-2feb4e44de89-sys\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.153993 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.153994 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5fa15de5-6550-43dc-afb0-2feb4e44de89-node-exporter-textfile\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.153993 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.154015 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5fa15de5-6550-43dc-afb0-2feb4e44de89-root\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.154336 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.154067 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5fa15de5-6550-43dc-afb0-2feb4e44de89-root\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.154336 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.154068 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5fa15de5-6550-43dc-afb0-2feb4e44de89-node-exporter-wtmp\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.154336 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.154099 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5fa15de5-6550-43dc-afb0-2feb4e44de89-sys\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.154336 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.154110 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5fa15de5-6550-43dc-afb0-2feb4e44de89-node-exporter-tls\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.154336 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.154175 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5fa15de5-6550-43dc-afb0-2feb4e44de89-metrics-client-ca\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.154336 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.154190 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5fa15de5-6550-43dc-afb0-2feb4e44de89-node-exporter-wtmp\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.154336 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.154213 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w78hl\" (UniqueName: \"kubernetes.io/projected/5fa15de5-6550-43dc-afb0-2feb4e44de89-kube-api-access-w78hl\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.154336 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.154236 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5fa15de5-6550-43dc-afb0-2feb4e44de89-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.154336 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.154254 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5fa15de5-6550-43dc-afb0-2feb4e44de89-node-exporter-accelerators-collector-config\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.154336 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:11:43.154277 2566 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 21:11:43.154740 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:11:43.154360 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fa15de5-6550-43dc-afb0-2feb4e44de89-node-exporter-tls podName:5fa15de5-6550-43dc-afb0-2feb4e44de89 nodeName:}" failed. No retries permitted until 2026-04-22 21:11:43.654339861 +0000 UTC m=+161.011154607 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/5fa15de5-6550-43dc-afb0-2feb4e44de89-node-exporter-tls") pod "node-exporter-tp9q7" (UID: "5fa15de5-6550-43dc-afb0-2feb4e44de89") : secret "node-exporter-tls" not found
Apr 22 21:11:43.154740 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.154394 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5fa15de5-6550-43dc-afb0-2feb4e44de89-node-exporter-textfile\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.154740 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.154715 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5fa15de5-6550-43dc-afb0-2feb4e44de89-metrics-client-ca\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.154846 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.154792 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5fa15de5-6550-43dc-afb0-2feb4e44de89-node-exporter-accelerators-collector-config\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.156572 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.156552 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5fa15de5-6550-43dc-afb0-2feb4e44de89-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.162679 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.162654 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w78hl\" (UniqueName: \"kubernetes.io/projected/5fa15de5-6550-43dc-afb0-2feb4e44de89-kube-api-access-w78hl\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.659608 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.659573 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8"
Apr 22 21:11:43.660101 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.659629 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5fa15de5-6550-43dc-afb0-2feb4e44de89-node-exporter-tls\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.660101 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.659659 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls\") pod \"dns-default-lszg5\" (UID: \"1161b36d-e8f6-417a-975a-80f7d1eba5e1\") " pod="openshift-dns/dns-default-lszg5"
Apr 22 21:11:43.660101 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.659693 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert\") pod \"ingress-canary-wzzzt\" (UID: \"18f0e0a7-a91f-432d-bcb0-5e33fb885077\") " pod="openshift-ingress-canary/ingress-canary-wzzzt"
Apr 22 21:11:43.662084 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.662056 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5fa15de5-6550-43dc-afb0-2feb4e44de89-node-exporter-tls\") pod \"node-exporter-tp9q7\" (UID: \"5fa15de5-6550-43dc-afb0-2feb4e44de89\") " pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.662194 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.662166 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls\") pod \"image-registry-79894b4455-t8gz8\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " pod="openshift-image-registry/image-registry-79894b4455-t8gz8"
Apr 22 21:11:43.662443 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.662403 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1161b36d-e8f6-417a-975a-80f7d1eba5e1-metrics-tls\") pod \"dns-default-lszg5\" (UID: \"1161b36d-e8f6-417a-975a-80f7d1eba5e1\") " pod="openshift-dns/dns-default-lszg5"
Apr 22 21:11:43.662538 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.662523 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/18f0e0a7-a91f-432d-bcb0-5e33fb885077-cert\") pod \"ingress-canary-wzzzt\" (UID: \"18f0e0a7-a91f-432d-bcb0-5e33fb885077\") " pod="openshift-ingress-canary/ingress-canary-wzzzt"
Apr 22 21:11:43.792840 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.792799 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2" event={"ID":"b521d1ec-ac06-4801-88b2-25972cfd8773","Type":"ContainerStarted","Data":"1d976537fb9ac5e8c341124bf5df79a513e7953e7aa12e53059b2a9867660cb4"}
Apr 22 21:11:43.793206 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.793160 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2"
Apr 22 21:11:43.793838 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.793817 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84d9f98565-ggst2"
Apr 22 21:11:43.794522 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.794503 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78ddf48d7f-x9l2f" event={"ID":"381b646f-ad72-498b-b7ba-1b4d21500e65","Type":"ContainerStarted","Data":"6e6660231e711c4f184ff88bc6c5605762697f25ea47aa1686863b496d891275"}
Apr 22 21:11:43.872686 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.872647 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tp9q7"
Apr 22 21:11:43.878015 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.877962 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-j9t5d\""
Apr 22 21:11:43.878178 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.878156 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tq52g\""
Apr 22 21:11:43.880760 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:11:43.880735 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fa15de5_6550_43dc_afb0_2feb4e44de89.slice/crio-55685e2bd8b90e15d5fee90c7639c6e0886301fd23c3f32b7d24f7c264943484 WatchSource:0}: Error finding container 55685e2bd8b90e15d5fee90c7639c6e0886301fd23c3f32b7d24f7c264943484: Status 404 returned error can't find the container with id 55685e2bd8b90e15d5fee90c7639c6e0886301fd23c3f32b7d24f7c264943484
Apr 22 21:11:43.886743 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.886712 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-79894b4455-t8gz8"
Apr 22 21:11:43.886857 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:43.886795 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wzzzt"
Apr 22 21:11:44.016878 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:44.016844 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wzzzt"]
Apr 22 21:11:44.020253 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:11:44.020196 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18f0e0a7_a91f_432d_bcb0_5e33fb885077.slice/crio-bed508cd389e08c6ea5d894375606f8ad7c859672bea8998521087587a7e761e WatchSource:0}: Error finding container bed508cd389e08c6ea5d894375606f8ad7c859672bea8998521087587a7e761e: Status 404 returned error can't find the container with id bed508cd389e08c6ea5d894375606f8ad7c859672bea8998521087587a7e761e
Apr 22 21:11:44.037170 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:44.037135 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-79894b4455-t8gz8"]
Apr 22 21:11:44.040339 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:11:44.040304 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedd98168_b1af_4fed_89c3_c589b37e1931.slice/crio-4f8b0f57252806d2739b4048e1dd1af2aee10b9524efa7aebd11e04c9ae912c7 WatchSource:0}: Error finding container 4f8b0f57252806d2739b4048e1dd1af2aee10b9524efa7aebd11e04c9ae912c7: Status 404 returned error can't find the container with id 4f8b0f57252806d2739b4048e1dd1af2aee10b9524efa7aebd11e04c9ae912c7
Apr 22 21:11:44.799023 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:44.798876 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wzzzt" event={"ID":"18f0e0a7-a91f-432d-bcb0-5e33fb885077","Type":"ContainerStarted","Data":"bed508cd389e08c6ea5d894375606f8ad7c859672bea8998521087587a7e761e"}
Apr 22 21:11:44.800568 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:44.800537 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tp9q7" event={"ID":"5fa15de5-6550-43dc-afb0-2feb4e44de89","Type":"ContainerStarted","Data":"cab081919ca6a3c1f7bb0a5d096d2f95e05a635983eada733cd8f1e7e335f811"}
Apr 22 21:11:44.800674 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:44.800583 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tp9q7" event={"ID":"5fa15de5-6550-43dc-afb0-2feb4e44de89","Type":"ContainerStarted","Data":"55685e2bd8b90e15d5fee90c7639c6e0886301fd23c3f32b7d24f7c264943484"}
Apr 22 21:11:44.801949 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:44.801921 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-79894b4455-t8gz8" event={"ID":"edd98168-b1af-4fed-89c3-c589b37e1931","Type":"ContainerStarted","Data":"fa8ce26cbf93a1734a0e7c1b931c2b839e9f608a7db301988e1b8c67eeae326d"}
Apr 22 21:11:44.802049 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:44.801959 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-79894b4455-t8gz8" event={"ID":"edd98168-b1af-4fed-89c3-c589b37e1931","Type":"ContainerStarted","Data":"4f8b0f57252806d2739b4048e1dd1af2aee10b9524efa7aebd11e04c9ae912c7"}
Apr 22 21:11:44.802108 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:44.802057 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-79894b4455-t8gz8"
Apr 22 21:11:44.831485 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:44.831420 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-79894b4455-t8gz8" podStartSLOduration=161.83139165 podStartE2EDuration="2m41.83139165s" podCreationTimestamp="2026-04-22 21:09:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:11:44.830042344 +0000 UTC m=+162.186857090" watchObservedRunningTime="2026-04-22 21:11:44.83139165 +0000 UTC m=+162.188206402"
Apr 22 21:11:45.805939 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:45.805908 2566 generic.go:358] "Generic (PLEG): container finished" podID="5fa15de5-6550-43dc-afb0-2feb4e44de89" containerID="cab081919ca6a3c1f7bb0a5d096d2f95e05a635983eada733cd8f1e7e335f811" exitCode=0
Apr 22 21:11:45.806279 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:45.805980 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tp9q7" event={"ID":"5fa15de5-6550-43dc-afb0-2feb4e44de89","Type":"ContainerDied","Data":"cab081919ca6a3c1f7bb0a5d096d2f95e05a635983eada733cd8f1e7e335f811"}
Apr 22 21:11:46.810028 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:46.809986 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tp9q7" event={"ID":"5fa15de5-6550-43dc-afb0-2feb4e44de89","Type":"ContainerStarted","Data":"2ca479b769bf7e180ae40cc11e093d370c25f2976e3ee6d371f1461fe3c69db5"}
Apr 22 21:11:46.810530 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:46.810037 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tp9q7" event={"ID":"5fa15de5-6550-43dc-afb0-2feb4e44de89","Type":"ContainerStarted","Data":"f64d58fb80a3e2cc91ccf87179007bcb67982fb21953e5e7e5d8c9be564c2091"}
Apr 22 21:11:46.811304 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:46.811281 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wzzzt" event={"ID":"18f0e0a7-a91f-432d-bcb0-5e33fb885077","Type":"ContainerStarted","Data":"a83e9bb222018299ac13903aff3445bfc819efc6b333959528a5dca12e30966f"}
Apr 22 21:11:46.826644 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:46.826597 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-tp9q7" podStartSLOduration=4.077516263 podStartE2EDuration="4.826582584s" podCreationTimestamp="2026-04-22 21:11:42 +0000 UTC" firstStartedPulling="2026-04-22 21:11:43.882666748 +0000 UTC m=+161.239481488" lastFinishedPulling="2026-04-22 21:11:44.631733077 +0000 UTC m=+161.988547809" observedRunningTime="2026-04-22 21:11:46.825682055 +0000 UTC m=+164.182496807" watchObservedRunningTime="2026-04-22 21:11:46.826582584 +0000 UTC m=+164.183397333"
Apr 22 21:11:46.838657 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:46.838598 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wzzzt" podStartSLOduration=129.997824669 podStartE2EDuration="2m11.838582708s" podCreationTimestamp="2026-04-22 21:09:35 +0000 UTC" firstStartedPulling="2026-04-22 21:11:44.022359102 +0000 UTC m=+161.379173836" lastFinishedPulling="2026-04-22 21:11:45.863117145 +0000 UTC m=+163.219931875" observedRunningTime="2026-04-22 21:11:46.837676619 +0000 UTC m=+164.194491389" watchObservedRunningTime="2026-04-22 21:11:46.838582708 +0000 UTC m=+164.195397460"
Apr 22 21:11:53.130016 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:53.129918 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lszg5"
Apr 22 21:11:53.130529 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:53.130086 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvqz5"
Apr 22 21:11:53.132033 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:53.132013 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-8d48v\""
Apr 22 21:11:53.140554 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:53.140536 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-lszg5"
Apr 22 21:11:53.258339 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:53.258306 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lszg5"]
Apr 22 21:11:53.261260 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:11:53.261226 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1161b36d_e8f6_417a_975a_80f7d1eba5e1.slice/crio-4f9ebc29e82b7ee884e2446dd1827a32c9c31f7d5437b73cb79570b8b67e1546 WatchSource:0}: Error finding container 4f9ebc29e82b7ee884e2446dd1827a32c9c31f7d5437b73cb79570b8b67e1546: Status 404 returned error can't find the container with id 4f9ebc29e82b7ee884e2446dd1827a32c9c31f7d5437b73cb79570b8b67e1546
Apr 22 21:11:53.829706 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:53.829653 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lszg5" event={"ID":"1161b36d-e8f6-417a-975a-80f7d1eba5e1","Type":"ContainerStarted","Data":"4f9ebc29e82b7ee884e2446dd1827a32c9c31f7d5437b73cb79570b8b67e1546"}
Apr 22 21:11:54.833760 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:54.833724 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lszg5" event={"ID":"1161b36d-e8f6-417a-975a-80f7d1eba5e1","Type":"ContainerStarted","Data":"0d4833bc0a6b2cfeda52cf52aaeb1a8eef69e0d2e5ebf5667ab66d44e8291d9a"}
Apr 22 21:11:54.833760 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:54.833763 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lszg5" event={"ID":"1161b36d-e8f6-417a-975a-80f7d1eba5e1","Type":"ContainerStarted","Data":"9644fffce0231dc2a17e4f29780e3639b503eccad7aa226bb0fde0216160d74b"}
Apr 22 21:11:54.834197 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:54.833854 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-lszg5"
Apr 22 21:11:54.846783 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:11:54.846739 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lszg5" podStartSLOduration=138.550999287 podStartE2EDuration="2m19.846723623s" podCreationTimestamp="2026-04-22 21:09:35 +0000 UTC" firstStartedPulling="2026-04-22 21:11:53.263061822 +0000 UTC m=+170.619876555" lastFinishedPulling="2026-04-22 21:11:54.558786158 +0000 UTC m=+171.915600891" observedRunningTime="2026-04-22 21:11:54.846192924 +0000 UTC m=+172.203007677" watchObservedRunningTime="2026-04-22 21:11:54.846723623 +0000 UTC m=+172.203538375"
Apr 22 21:12:03.892508 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:03.892473 2566 patch_prober.go:28] interesting pod/image-registry-79894b4455-t8gz8 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 21:12:03.892914 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:03.892536 2566 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-79894b4455-t8gz8" podUID="edd98168-b1af-4fed-89c3-c589b37e1931" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 21:12:04.839040 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:04.839003 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lszg5"
Apr 22 21:12:05.810192 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:05.810155 2566 patch_prober.go:28] interesting pod/image-registry-79894b4455-t8gz8 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 21:12:05.810689 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:05.810216 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-79894b4455-t8gz8" podUID="edd98168-b1af-4fed-89c3-c589b37e1931" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 21:12:07.375965 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:07.375928 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-79894b4455-t8gz8"]
Apr 22 21:12:07.379706 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:07.379681 2566 patch_prober.go:28] interesting pod/image-registry-79894b4455-t8gz8 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 21:12:07.379839 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:07.379732 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-79894b4455-t8gz8" podUID="edd98168-b1af-4fed-89c3-c589b37e1931" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 21:12:17.380279 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:17.380248 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-79894b4455-t8gz8"
Apr 22 21:12:32.396430 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:32.396373 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-79894b4455-t8gz8" podUID="edd98168-b1af-4fed-89c3-c589b37e1931" containerName="registry" containerID="cri-o://fa8ce26cbf93a1734a0e7c1b931c2b839e9f608a7db301988e1b8c67eeae326d" gracePeriod=30
Apr 22 21:12:33.508790 ip-10-0-134-137
kubenswrapper[2566]: E0422 21:12:33.508755 2566 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedd98168_b1af_4fed_89c3_c589b37e1931.slice/crio-conmon-fa8ce26cbf93a1734a0e7c1b931c2b839e9f608a7db301988e1b8c67eeae326d.scope\": RecentStats: unable to find data in memory cache]" Apr 22 21:12:33.638837 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.638813 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-79894b4455-t8gz8" Apr 22 21:12:33.738468 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.738342 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/edd98168-b1af-4fed-89c3-c589b37e1931-image-registry-private-configuration\") pod \"edd98168-b1af-4fed-89c3-c589b37e1931\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " Apr 22 21:12:33.738468 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.738401 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls\") pod \"edd98168-b1af-4fed-89c3-c589b37e1931\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " Apr 22 21:12:33.738468 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.738443 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/edd98168-b1af-4fed-89c3-c589b37e1931-ca-trust-extracted\") pod \"edd98168-b1af-4fed-89c3-c589b37e1931\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " Apr 22 21:12:33.738468 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.738466 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mfqb\" (UniqueName: 
\"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-kube-api-access-4mfqb\") pod \"edd98168-b1af-4fed-89c3-c589b37e1931\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " Apr 22 21:12:33.738792 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.738508 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/edd98168-b1af-4fed-89c3-c589b37e1931-trusted-ca\") pod \"edd98168-b1af-4fed-89c3-c589b37e1931\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " Apr 22 21:12:33.738792 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.738528 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-bound-sa-token\") pod \"edd98168-b1af-4fed-89c3-c589b37e1931\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " Apr 22 21:12:33.738792 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.738551 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/edd98168-b1af-4fed-89c3-c589b37e1931-registry-certificates\") pod \"edd98168-b1af-4fed-89c3-c589b37e1931\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " Apr 22 21:12:33.738792 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.738568 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/edd98168-b1af-4fed-89c3-c589b37e1931-installation-pull-secrets\") pod \"edd98168-b1af-4fed-89c3-c589b37e1931\" (UID: \"edd98168-b1af-4fed-89c3-c589b37e1931\") " Apr 22 21:12:33.739223 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.739164 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edd98168-b1af-4fed-89c3-c589b37e1931-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod 
"edd98168-b1af-4fed-89c3-c589b37e1931" (UID: "edd98168-b1af-4fed-89c3-c589b37e1931"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:12:33.739223 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.739171 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edd98168-b1af-4fed-89c3-c589b37e1931-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "edd98168-b1af-4fed-89c3-c589b37e1931" (UID: "edd98168-b1af-4fed-89c3-c589b37e1931"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 21:12:33.741119 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.741072 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "edd98168-b1af-4fed-89c3-c589b37e1931" (UID: "edd98168-b1af-4fed-89c3-c589b37e1931"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:12:33.741329 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.741301 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edd98168-b1af-4fed-89c3-c589b37e1931-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "edd98168-b1af-4fed-89c3-c589b37e1931" (UID: "edd98168-b1af-4fed-89c3-c589b37e1931"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:12:33.741439 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.741361 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-kube-api-access-4mfqb" (OuterVolumeSpecName: "kube-api-access-4mfqb") pod "edd98168-b1af-4fed-89c3-c589b37e1931" (UID: "edd98168-b1af-4fed-89c3-c589b37e1931"). 
InnerVolumeSpecName "kube-api-access-4mfqb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:12:33.741439 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.741361 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edd98168-b1af-4fed-89c3-c589b37e1931-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "edd98168-b1af-4fed-89c3-c589b37e1931" (UID: "edd98168-b1af-4fed-89c3-c589b37e1931"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:12:33.741439 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.741365 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "edd98168-b1af-4fed-89c3-c589b37e1931" (UID: "edd98168-b1af-4fed-89c3-c589b37e1931"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:12:33.747862 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.747833 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edd98168-b1af-4fed-89c3-c589b37e1931-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "edd98168-b1af-4fed-89c3-c589b37e1931" (UID: "edd98168-b1af-4fed-89c3-c589b37e1931"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:12:33.839438 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.839368 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/edd98168-b1af-4fed-89c3-c589b37e1931-trusted-ca\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:12:33.839438 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.839430 2566 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-bound-sa-token\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:12:33.839438 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.839445 2566 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/edd98168-b1af-4fed-89c3-c589b37e1931-registry-certificates\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:12:33.839689 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.839461 2566 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/edd98168-b1af-4fed-89c3-c589b37e1931-installation-pull-secrets\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:12:33.839689 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.839476 2566 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/edd98168-b1af-4fed-89c3-c589b37e1931-image-registry-private-configuration\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:12:33.839689 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.839489 2566 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-registry-tls\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 
21:12:33.839689 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.839501 2566 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/edd98168-b1af-4fed-89c3-c589b37e1931-ca-trust-extracted\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:12:33.839689 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.839513 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4mfqb\" (UniqueName: \"kubernetes.io/projected/edd98168-b1af-4fed-89c3-c589b37e1931-kube-api-access-4mfqb\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:12:33.939914 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.939881 2566 generic.go:358] "Generic (PLEG): container finished" podID="edd98168-b1af-4fed-89c3-c589b37e1931" containerID="fa8ce26cbf93a1734a0e7c1b931c2b839e9f608a7db301988e1b8c67eeae326d" exitCode=0 Apr 22 21:12:33.940077 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.939940 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-79894b4455-t8gz8" Apr 22 21:12:33.940077 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.939962 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-79894b4455-t8gz8" event={"ID":"edd98168-b1af-4fed-89c3-c589b37e1931","Type":"ContainerDied","Data":"fa8ce26cbf93a1734a0e7c1b931c2b839e9f608a7db301988e1b8c67eeae326d"} Apr 22 21:12:33.940077 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.940005 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-79894b4455-t8gz8" event={"ID":"edd98168-b1af-4fed-89c3-c589b37e1931","Type":"ContainerDied","Data":"4f8b0f57252806d2739b4048e1dd1af2aee10b9524efa7aebd11e04c9ae912c7"} Apr 22 21:12:33.940077 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.940022 2566 scope.go:117] "RemoveContainer" containerID="fa8ce26cbf93a1734a0e7c1b931c2b839e9f608a7db301988e1b8c67eeae326d" Apr 22 21:12:33.947640 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.947617 2566 scope.go:117] "RemoveContainer" containerID="fa8ce26cbf93a1734a0e7c1b931c2b839e9f608a7db301988e1b8c67eeae326d" Apr 22 21:12:33.948007 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:12:33.947985 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa8ce26cbf93a1734a0e7c1b931c2b839e9f608a7db301988e1b8c67eeae326d\": container with ID starting with fa8ce26cbf93a1734a0e7c1b931c2b839e9f608a7db301988e1b8c67eeae326d not found: ID does not exist" containerID="fa8ce26cbf93a1734a0e7c1b931c2b839e9f608a7db301988e1b8c67eeae326d" Apr 22 21:12:33.948094 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.948014 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa8ce26cbf93a1734a0e7c1b931c2b839e9f608a7db301988e1b8c67eeae326d"} err="failed to get container status 
\"fa8ce26cbf93a1734a0e7c1b931c2b839e9f608a7db301988e1b8c67eeae326d\": rpc error: code = NotFound desc = could not find container \"fa8ce26cbf93a1734a0e7c1b931c2b839e9f608a7db301988e1b8c67eeae326d\": container with ID starting with fa8ce26cbf93a1734a0e7c1b931c2b839e9f608a7db301988e1b8c67eeae326d not found: ID does not exist" Apr 22 21:12:33.958965 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.958941 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-79894b4455-t8gz8"] Apr 22 21:12:33.963755 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:33.963736 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-79894b4455-t8gz8"] Apr 22 21:12:35.132375 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:35.132340 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edd98168-b1af-4fed-89c3-c589b37e1931" path="/var/lib/kubelet/pods/edd98168-b1af-4fed-89c3-c589b37e1931/volumes" Apr 22 21:12:35.933751 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:35.933713 2566 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" podUID="e942491f-8eab-4373-99e0-3a821262ac0d" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 21:12:45.932772 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:45.932724 2566 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" podUID="e942491f-8eab-4373-99e0-3a821262ac0d" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 21:12:55.932957 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:55.932914 2566 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" 
podUID="e942491f-8eab-4373-99e0-3a821262ac0d" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 21:12:55.933450 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:55.932996 2566 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" Apr 22 21:12:55.933497 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:55.933480 2566 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"cb21c23896ce4a461917db3d79e2a371b5b23ea4da996d49393cecfdc782dabc"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 21:12:55.933532 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:55.933516 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" podUID="e942491f-8eab-4373-99e0-3a821262ac0d" containerName="service-proxy" containerID="cri-o://cb21c23896ce4a461917db3d79e2a371b5b23ea4da996d49393cecfdc782dabc" gracePeriod=30 Apr 22 21:12:57.002853 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:57.002816 2566 generic.go:358] "Generic (PLEG): container finished" podID="e942491f-8eab-4373-99e0-3a821262ac0d" containerID="cb21c23896ce4a461917db3d79e2a371b5b23ea4da996d49393cecfdc782dabc" exitCode=2 Apr 22 21:12:57.003269 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:57.002867 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" event={"ID":"e942491f-8eab-4373-99e0-3a821262ac0d","Type":"ContainerDied","Data":"cb21c23896ce4a461917db3d79e2a371b5b23ea4da996d49393cecfdc782dabc"} Apr 22 21:12:57.003269 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:12:57.002900 2566 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7dbb487bf6-hmzzm" event={"ID":"e942491f-8eab-4373-99e0-3a821262ac0d","Type":"ContainerStarted","Data":"583e1a4a103fd5d20b3a2dbeb868416409c1fe343b2e5f3feeb47cd1bc83f65e"} Apr 22 21:13:13.938227 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:13:13.938173 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs\") pod \"network-metrics-daemon-qvqz5\" (UID: \"01c1be80-c7fc-433f-bf11-a97af5540866\") " pod="openshift-multus/network-metrics-daemon-qvqz5" Apr 22 21:13:13.940531 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:13:13.940509 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01c1be80-c7fc-433f-bf11-a97af5540866-metrics-certs\") pod \"network-metrics-daemon-qvqz5\" (UID: \"01c1be80-c7fc-433f-bf11-a97af5540866\") " pod="openshift-multus/network-metrics-daemon-qvqz5" Apr 22 21:13:14.133347 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:13:14.133318 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bw4nx\"" Apr 22 21:13:14.140744 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:13:14.140709 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qvqz5" Apr 22 21:13:14.285365 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:13:14.285325 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qvqz5"] Apr 22 21:13:14.288992 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:13:14.288954 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01c1be80_c7fc_433f_bf11_a97af5540866.slice/crio-eb0b2730caa0fcdb71523f23a3dee717cfce7891ba7373e190361414d1b5cfc5 WatchSource:0}: Error finding container eb0b2730caa0fcdb71523f23a3dee717cfce7891ba7373e190361414d1b5cfc5: Status 404 returned error can't find the container with id eb0b2730caa0fcdb71523f23a3dee717cfce7891ba7373e190361414d1b5cfc5 Apr 22 21:13:15.049531 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:13:15.049487 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qvqz5" event={"ID":"01c1be80-c7fc-433f-bf11-a97af5540866","Type":"ContainerStarted","Data":"eb0b2730caa0fcdb71523f23a3dee717cfce7891ba7373e190361414d1b5cfc5"} Apr 22 21:13:16.053993 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:13:16.053950 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qvqz5" event={"ID":"01c1be80-c7fc-433f-bf11-a97af5540866","Type":"ContainerStarted","Data":"b9684307f8462c77f47f3d265a4e4df3a9adfdf1776b849cc7d6cb695823f1a1"} Apr 22 21:13:16.053993 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:13:16.053988 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qvqz5" event={"ID":"01c1be80-c7fc-433f-bf11-a97af5540866","Type":"ContainerStarted","Data":"90430bfce39496eeb4077da5173944600b65070990b2b0043241eec74325c0b1"} Apr 22 21:13:16.068420 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:13:16.068349 2566 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-qvqz5" podStartSLOduration=252.10832239 podStartE2EDuration="4m13.068334075s" podCreationTimestamp="2026-04-22 21:09:03 +0000 UTC" firstStartedPulling="2026-04-22 21:13:14.290932123 +0000 UTC m=+251.647746857" lastFinishedPulling="2026-04-22 21:13:15.250943808 +0000 UTC m=+252.607758542" observedRunningTime="2026-04-22 21:13:16.067839256 +0000 UTC m=+253.424654008" watchObservedRunningTime="2026-04-22 21:13:16.068334075 +0000 UTC m=+253.425148827" Apr 22 21:14:03.004794 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:14:03.004762 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovn-acl-logging/0.log" Apr 22 21:14:03.005312 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:14:03.004956 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovn-acl-logging/0.log" Apr 22 21:14:03.010629 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:14:03.010609 2566 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 21:15:24.822770 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:24.822729 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7"] Apr 22 21:15:24.823246 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:24.822985 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edd98168-b1af-4fed-89c3-c589b37e1931" containerName="registry" Apr 22 21:15:24.823246 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:24.822997 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd98168-b1af-4fed-89c3-c589b37e1931" containerName="registry" Apr 22 21:15:24.823246 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:24.823046 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="edd98168-b1af-4fed-89c3-c589b37e1931" 
containerName="registry" Apr 22 21:15:24.825817 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:24.825800 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7" Apr 22 21:15:24.828699 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:24.828674 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 21:15:24.828993 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:24.828972 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bx54h\"" Apr 22 21:15:24.829099 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:24.829063 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 21:15:24.833982 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:24.833958 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7"] Apr 22 21:15:24.921142 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:24.921101 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e4164806-c1e4-40e3-9e86-a440dd2b237f-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7\" (UID: \"e4164806-c1e4-40e3-9e86-a440dd2b237f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7" Apr 22 21:15:24.921142 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:24.921145 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e4164806-c1e4-40e3-9e86-a440dd2b237f-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7\" (UID: 
\"e4164806-c1e4-40e3-9e86-a440dd2b237f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7" Apr 22 21:15:24.921372 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:24.921165 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5xv9\" (UniqueName: \"kubernetes.io/projected/e4164806-c1e4-40e3-9e86-a440dd2b237f-kube-api-access-h5xv9\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7\" (UID: \"e4164806-c1e4-40e3-9e86-a440dd2b237f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7" Apr 22 21:15:25.021841 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:25.021808 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e4164806-c1e4-40e3-9e86-a440dd2b237f-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7\" (UID: \"e4164806-c1e4-40e3-9e86-a440dd2b237f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7" Apr 22 21:15:25.021841 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:25.021853 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e4164806-c1e4-40e3-9e86-a440dd2b237f-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7\" (UID: \"e4164806-c1e4-40e3-9e86-a440dd2b237f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7" Apr 22 21:15:25.022117 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:25.021882 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5xv9\" (UniqueName: \"kubernetes.io/projected/e4164806-c1e4-40e3-9e86-a440dd2b237f-kube-api-access-h5xv9\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7\" (UID: 
\"e4164806-c1e4-40e3-9e86-a440dd2b237f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7"
Apr 22 21:15:25.022221 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:25.022197 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e4164806-c1e4-40e3-9e86-a440dd2b237f-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7\" (UID: \"e4164806-c1e4-40e3-9e86-a440dd2b237f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7"
Apr 22 21:15:25.022278 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:25.022221 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e4164806-c1e4-40e3-9e86-a440dd2b237f-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7\" (UID: \"e4164806-c1e4-40e3-9e86-a440dd2b237f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7"
Apr 22 21:15:25.029600 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:25.029579 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5xv9\" (UniqueName: \"kubernetes.io/projected/e4164806-c1e4-40e3-9e86-a440dd2b237f-kube-api-access-h5xv9\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7\" (UID: \"e4164806-c1e4-40e3-9e86-a440dd2b237f\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7"
Apr 22 21:15:25.135529 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:25.135452 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7"
Apr 22 21:15:25.251283 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:25.251248 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7"]
Apr 22 21:15:25.254449 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:15:25.254401 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4164806_c1e4_40e3_9e86_a440dd2b237f.slice/crio-1fd41e72aa321cb1dbd36bcef86981cd5c21c8e34303bebb1585030d4a9be30f WatchSource:0}: Error finding container 1fd41e72aa321cb1dbd36bcef86981cd5c21c8e34303bebb1585030d4a9be30f: Status 404 returned error can't find the container with id 1fd41e72aa321cb1dbd36bcef86981cd5c21c8e34303bebb1585030d4a9be30f
Apr 22 21:15:25.256268 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:25.256251 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 21:15:25.380448 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:25.380402 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7" event={"ID":"e4164806-c1e4-40e3-9e86-a440dd2b237f","Type":"ContainerStarted","Data":"1fd41e72aa321cb1dbd36bcef86981cd5c21c8e34303bebb1585030d4a9be30f"}
Apr 22 21:15:31.397193 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:31.397158 2566 generic.go:358] "Generic (PLEG): container finished" podID="e4164806-c1e4-40e3-9e86-a440dd2b237f" containerID="335db6b47f4088e23a29dc808ce932749b0a609884d38946d77b548d41bd5666" exitCode=0
Apr 22 21:15:31.397193 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:31.397198 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7" event={"ID":"e4164806-c1e4-40e3-9e86-a440dd2b237f","Type":"ContainerDied","Data":"335db6b47f4088e23a29dc808ce932749b0a609884d38946d77b548d41bd5666"}
Apr 22 21:15:34.407207 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:34.407173 2566 generic.go:358] "Generic (PLEG): container finished" podID="e4164806-c1e4-40e3-9e86-a440dd2b237f" containerID="5d35ef94f6e56eb86408d444cd9b0e4ffb020e5b5832c5270b14eec1f3f81515" exitCode=0
Apr 22 21:15:34.407602 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:34.407214 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7" event={"ID":"e4164806-c1e4-40e3-9e86-a440dd2b237f","Type":"ContainerDied","Data":"5d35ef94f6e56eb86408d444cd9b0e4ffb020e5b5832c5270b14eec1f3f81515"}
Apr 22 21:15:40.426332 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:40.426292 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7" event={"ID":"e4164806-c1e4-40e3-9e86-a440dd2b237f","Type":"ContainerStarted","Data":"cbeb50c2f899d4c34a2f4bf41f4868638e96fd88a3f5fbd26c82afeafdccb5fc"}
Apr 22 21:15:40.446544 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:40.446490 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7" podStartSLOduration=1.40575181 podStartE2EDuration="16.446468895s" podCreationTimestamp="2026-04-22 21:15:24 +0000 UTC" firstStartedPulling="2026-04-22 21:15:25.25637887 +0000 UTC m=+382.613193601" lastFinishedPulling="2026-04-22 21:15:40.297095953 +0000 UTC m=+397.653910686" observedRunningTime="2026-04-22 21:15:40.446465022 +0000 UTC m=+397.803279774" watchObservedRunningTime="2026-04-22 21:15:40.446468895 +0000 UTC m=+397.803283629"
Apr 22 21:15:41.430338 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:41.430304 2566 generic.go:358] "Generic (PLEG): container finished" podID="e4164806-c1e4-40e3-9e86-a440dd2b237f" containerID="cbeb50c2f899d4c34a2f4bf41f4868638e96fd88a3f5fbd26c82afeafdccb5fc" exitCode=0
Apr 22 21:15:41.430798 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:41.430348 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7" event={"ID":"e4164806-c1e4-40e3-9e86-a440dd2b237f","Type":"ContainerDied","Data":"cbeb50c2f899d4c34a2f4bf41f4868638e96fd88a3f5fbd26c82afeafdccb5fc"}
Apr 22 21:15:42.552005 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:42.551977 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7"
Apr 22 21:15:42.652607 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:42.652577 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e4164806-c1e4-40e3-9e86-a440dd2b237f-util\") pod \"e4164806-c1e4-40e3-9e86-a440dd2b237f\" (UID: \"e4164806-c1e4-40e3-9e86-a440dd2b237f\") "
Apr 22 21:15:42.652767 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:42.652649 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e4164806-c1e4-40e3-9e86-a440dd2b237f-bundle\") pod \"e4164806-c1e4-40e3-9e86-a440dd2b237f\" (UID: \"e4164806-c1e4-40e3-9e86-a440dd2b237f\") "
Apr 22 21:15:42.652767 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:42.652668 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5xv9\" (UniqueName: \"kubernetes.io/projected/e4164806-c1e4-40e3-9e86-a440dd2b237f-kube-api-access-h5xv9\") pod \"e4164806-c1e4-40e3-9e86-a440dd2b237f\" (UID: \"e4164806-c1e4-40e3-9e86-a440dd2b237f\") "
Apr 22 21:15:42.653339 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:42.653311 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4164806-c1e4-40e3-9e86-a440dd2b237f-bundle" (OuterVolumeSpecName: "bundle") pod "e4164806-c1e4-40e3-9e86-a440dd2b237f" (UID: "e4164806-c1e4-40e3-9e86-a440dd2b237f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:15:42.654845 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:42.654821 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4164806-c1e4-40e3-9e86-a440dd2b237f-kube-api-access-h5xv9" (OuterVolumeSpecName: "kube-api-access-h5xv9") pod "e4164806-c1e4-40e3-9e86-a440dd2b237f" (UID: "e4164806-c1e4-40e3-9e86-a440dd2b237f"). InnerVolumeSpecName "kube-api-access-h5xv9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 21:15:42.656281 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:42.656260 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4164806-c1e4-40e3-9e86-a440dd2b237f-util" (OuterVolumeSpecName: "util") pod "e4164806-c1e4-40e3-9e86-a440dd2b237f" (UID: "e4164806-c1e4-40e3-9e86-a440dd2b237f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:15:42.753969 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:42.753897 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e4164806-c1e4-40e3-9e86-a440dd2b237f-bundle\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\""
Apr 22 21:15:42.753969 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:42.753924 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h5xv9\" (UniqueName: \"kubernetes.io/projected/e4164806-c1e4-40e3-9e86-a440dd2b237f-kube-api-access-h5xv9\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\""
Apr 22 21:15:42.753969 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:42.753935 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e4164806-c1e4-40e3-9e86-a440dd2b237f-util\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\""
Apr 22 21:15:43.436856 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:43.436770 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7" event={"ID":"e4164806-c1e4-40e3-9e86-a440dd2b237f","Type":"ContainerDied","Data":"1fd41e72aa321cb1dbd36bcef86981cd5c21c8e34303bebb1585030d4a9be30f"}
Apr 22 21:15:43.436856 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:43.436816 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fd41e72aa321cb1dbd36bcef86981cd5c21c8e34303bebb1585030d4a9be30f"
Apr 22 21:15:43.436856 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:43.436838 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d4hpb7"
Apr 22 21:15:51.726114 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.726074 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd"]
Apr 22 21:15:51.726520 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.726300 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4164806-c1e4-40e3-9e86-a440dd2b237f" containerName="pull"
Apr 22 21:15:51.726520 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.726311 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4164806-c1e4-40e3-9e86-a440dd2b237f" containerName="pull"
Apr 22 21:15:51.726520 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.726327 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4164806-c1e4-40e3-9e86-a440dd2b237f" containerName="extract"
Apr 22 21:15:51.726520 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.726332 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4164806-c1e4-40e3-9e86-a440dd2b237f" containerName="extract"
Apr 22 21:15:51.726520 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.726341 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4164806-c1e4-40e3-9e86-a440dd2b237f" containerName="util"
Apr 22 21:15:51.726520 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.726347 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4164806-c1e4-40e3-9e86-a440dd2b237f" containerName="util"
Apr 22 21:15:51.726520 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.726392 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="e4164806-c1e4-40e3-9e86-a440dd2b237f" containerName="extract"
Apr 22 21:15:51.779430 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.779382 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd"]
Apr 22 21:15:51.779603 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.779577 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd"
Apr 22 21:15:51.781671 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.781643 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 21:15:51.781814 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.781718 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 21:15:51.782263 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.782244 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bx54h\""
Apr 22 21:15:51.818772 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.818731 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa3b0210-08ec-4cc4-aeb5-455d861bec50-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd\" (UID: \"aa3b0210-08ec-4cc4-aeb5-455d861bec50\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd"
Apr 22 21:15:51.818772 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.818777 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6xlb\" (UniqueName: \"kubernetes.io/projected/aa3b0210-08ec-4cc4-aeb5-455d861bec50-kube-api-access-j6xlb\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd\" (UID: \"aa3b0210-08ec-4cc4-aeb5-455d861bec50\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd"
Apr 22 21:15:51.818973 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.818869 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa3b0210-08ec-4cc4-aeb5-455d861bec50-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd\" (UID: \"aa3b0210-08ec-4cc4-aeb5-455d861bec50\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd"
Apr 22 21:15:51.920192 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.920152 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa3b0210-08ec-4cc4-aeb5-455d861bec50-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd\" (UID: \"aa3b0210-08ec-4cc4-aeb5-455d861bec50\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd"
Apr 22 21:15:51.920192 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.920192 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6xlb\" (UniqueName: \"kubernetes.io/projected/aa3b0210-08ec-4cc4-aeb5-455d861bec50-kube-api-access-j6xlb\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd\" (UID: \"aa3b0210-08ec-4cc4-aeb5-455d861bec50\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd"
Apr 22 21:15:51.920404 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.920314 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa3b0210-08ec-4cc4-aeb5-455d861bec50-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd\" (UID: \"aa3b0210-08ec-4cc4-aeb5-455d861bec50\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd"
Apr 22 21:15:51.920574 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.920555 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa3b0210-08ec-4cc4-aeb5-455d861bec50-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd\" (UID: \"aa3b0210-08ec-4cc4-aeb5-455d861bec50\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd"
Apr 22 21:15:51.920609 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.920577 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa3b0210-08ec-4cc4-aeb5-455d861bec50-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd\" (UID: \"aa3b0210-08ec-4cc4-aeb5-455d861bec50\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd"
Apr 22 21:15:51.926878 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:51.926853 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6xlb\" (UniqueName: \"kubernetes.io/projected/aa3b0210-08ec-4cc4-aeb5-455d861bec50-kube-api-access-j6xlb\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd\" (UID: \"aa3b0210-08ec-4cc4-aeb5-455d861bec50\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd"
Apr 22 21:15:52.088875 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:52.088836 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd"
Apr 22 21:15:52.212741 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:52.212692 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd"]
Apr 22 21:15:52.215819 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:15:52.215788 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa3b0210_08ec_4cc4_aeb5_455d861bec50.slice/crio-0bb99a4f86455d849c97036f9923f28705d784ec61801ee638aea975ed0d2d8e WatchSource:0}: Error finding container 0bb99a4f86455d849c97036f9923f28705d784ec61801ee638aea975ed0d2d8e: Status 404 returned error can't find the container with id 0bb99a4f86455d849c97036f9923f28705d784ec61801ee638aea975ed0d2d8e
Apr 22 21:15:52.460177 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:52.460142 2566 generic.go:358] "Generic (PLEG): container finished" podID="aa3b0210-08ec-4cc4-aeb5-455d861bec50" containerID="28434569a628ce9464143fbd6f26c62961c27b187ff374e20a3a2d472226d12a" exitCode=0
Apr 22 21:15:52.460340 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:52.460188 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd" event={"ID":"aa3b0210-08ec-4cc4-aeb5-455d861bec50","Type":"ContainerDied","Data":"28434569a628ce9464143fbd6f26c62961c27b187ff374e20a3a2d472226d12a"}
Apr 22 21:15:52.460340 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:52.460213 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd" event={"ID":"aa3b0210-08ec-4cc4-aeb5-455d861bec50","Type":"ContainerStarted","Data":"0bb99a4f86455d849c97036f9923f28705d784ec61801ee638aea975ed0d2d8e"}
Apr 22 21:15:55.470067 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:55.470033 2566 generic.go:358] "Generic (PLEG): container finished" podID="aa3b0210-08ec-4cc4-aeb5-455d861bec50" containerID="33dbac724ac256cf34df35016a08d68126561612992d1705f6419c7e0726e6eb" exitCode=0
Apr 22 21:15:55.470483 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:55.470110 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd" event={"ID":"aa3b0210-08ec-4cc4-aeb5-455d861bec50","Type":"ContainerDied","Data":"33dbac724ac256cf34df35016a08d68126561612992d1705f6419c7e0726e6eb"}
Apr 22 21:15:56.474212 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:56.474174 2566 generic.go:358] "Generic (PLEG): container finished" podID="aa3b0210-08ec-4cc4-aeb5-455d861bec50" containerID="3b3100d3682a3ee0a9a3e84a6db8fdba9413b485327e178db0c9b87792b9aaed" exitCode=0
Apr 22 21:15:56.474722 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:56.474227 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd" event={"ID":"aa3b0210-08ec-4cc4-aeb5-455d861bec50","Type":"ContainerDied","Data":"3b3100d3682a3ee0a9a3e84a6db8fdba9413b485327e178db0c9b87792b9aaed"}
Apr 22 21:15:57.592742 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:57.592711 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd"
Apr 22 21:15:57.663852 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:57.663815 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6xlb\" (UniqueName: \"kubernetes.io/projected/aa3b0210-08ec-4cc4-aeb5-455d861bec50-kube-api-access-j6xlb\") pod \"aa3b0210-08ec-4cc4-aeb5-455d861bec50\" (UID: \"aa3b0210-08ec-4cc4-aeb5-455d861bec50\") "
Apr 22 21:15:57.664040 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:57.663871 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa3b0210-08ec-4cc4-aeb5-455d861bec50-util\") pod \"aa3b0210-08ec-4cc4-aeb5-455d861bec50\" (UID: \"aa3b0210-08ec-4cc4-aeb5-455d861bec50\") "
Apr 22 21:15:57.664040 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:57.663900 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa3b0210-08ec-4cc4-aeb5-455d861bec50-bundle\") pod \"aa3b0210-08ec-4cc4-aeb5-455d861bec50\" (UID: \"aa3b0210-08ec-4cc4-aeb5-455d861bec50\") "
Apr 22 21:15:57.664356 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:57.664324 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa3b0210-08ec-4cc4-aeb5-455d861bec50-bundle" (OuterVolumeSpecName: "bundle") pod "aa3b0210-08ec-4cc4-aeb5-455d861bec50" (UID: "aa3b0210-08ec-4cc4-aeb5-455d861bec50"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:15:57.666066 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:57.666038 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa3b0210-08ec-4cc4-aeb5-455d861bec50-kube-api-access-j6xlb" (OuterVolumeSpecName: "kube-api-access-j6xlb") pod "aa3b0210-08ec-4cc4-aeb5-455d861bec50" (UID: "aa3b0210-08ec-4cc4-aeb5-455d861bec50"). InnerVolumeSpecName "kube-api-access-j6xlb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 21:15:57.737791 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:57.737701 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa3b0210-08ec-4cc4-aeb5-455d861bec50-util" (OuterVolumeSpecName: "util") pod "aa3b0210-08ec-4cc4-aeb5-455d861bec50" (UID: "aa3b0210-08ec-4cc4-aeb5-455d861bec50"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:15:57.764614 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:57.764581 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j6xlb\" (UniqueName: \"kubernetes.io/projected/aa3b0210-08ec-4cc4-aeb5-455d861bec50-kube-api-access-j6xlb\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\""
Apr 22 21:15:57.764614 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:57.764612 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aa3b0210-08ec-4cc4-aeb5-455d861bec50-util\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\""
Apr 22 21:15:57.764739 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:57.764622 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aa3b0210-08ec-4cc4-aeb5-455d861bec50-bundle\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\""
Apr 22 21:15:58.481044 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:58.481011 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd" event={"ID":"aa3b0210-08ec-4cc4-aeb5-455d861bec50","Type":"ContainerDied","Data":"0bb99a4f86455d849c97036f9923f28705d784ec61801ee638aea975ed0d2d8e"}
Apr 22 21:15:58.481044 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:58.481048 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bb99a4f86455d849c97036f9923f28705d784ec61801ee638aea975ed0d2d8e"
Apr 22 21:15:58.481250 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:15:58.481056 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f45mwd"
Apr 22 21:16:00.019296 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.019259 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-x8lxp"]
Apr 22 21:16:00.019695 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.019506 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa3b0210-08ec-4cc4-aeb5-455d861bec50" containerName="util"
Apr 22 21:16:00.019695 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.019518 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa3b0210-08ec-4cc4-aeb5-455d861bec50" containerName="util"
Apr 22 21:16:00.019695 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.019531 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa3b0210-08ec-4cc4-aeb5-455d861bec50" containerName="extract"
Apr 22 21:16:00.019695 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.019538 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa3b0210-08ec-4cc4-aeb5-455d861bec50" containerName="extract"
Apr 22 21:16:00.019695 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.019548 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa3b0210-08ec-4cc4-aeb5-455d861bec50" containerName="pull"
Apr 22 21:16:00.019695 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.019553 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa3b0210-08ec-4cc4-aeb5-455d861bec50" containerName="pull"
Apr 22 21:16:00.019695 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.019595 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa3b0210-08ec-4cc4-aeb5-455d861bec50" containerName="extract"
Apr 22 21:16:00.024034 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.024013 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-x8lxp"
Apr 22 21:16:00.025842 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.025817 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 22 21:16:00.026325 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.026309 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 22 21:16:00.026371 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.026311 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-hj5fd\""
Apr 22 21:16:00.029085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.029060 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-x8lxp"]
Apr 22 21:16:00.082012 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.081969 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rfjg\" (UniqueName: \"kubernetes.io/projected/cd85a0e5-8d2e-474b-a9cf-732f436c9e94-kube-api-access-4rfjg\") pod \"cert-manager-cainjector-68b757865b-x8lxp\" (UID: \"cd85a0e5-8d2e-474b-a9cf-732f436c9e94\") " pod="cert-manager/cert-manager-cainjector-68b757865b-x8lxp"
Apr 22 21:16:00.082183 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.082016 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd85a0e5-8d2e-474b-a9cf-732f436c9e94-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-x8lxp\" (UID: \"cd85a0e5-8d2e-474b-a9cf-732f436c9e94\") " pod="cert-manager/cert-manager-cainjector-68b757865b-x8lxp"
Apr 22 21:16:00.183423 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.183383 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4rfjg\" (UniqueName: \"kubernetes.io/projected/cd85a0e5-8d2e-474b-a9cf-732f436c9e94-kube-api-access-4rfjg\") pod \"cert-manager-cainjector-68b757865b-x8lxp\" (UID: \"cd85a0e5-8d2e-474b-a9cf-732f436c9e94\") " pod="cert-manager/cert-manager-cainjector-68b757865b-x8lxp"
Apr 22 21:16:00.183596 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.183453 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd85a0e5-8d2e-474b-a9cf-732f436c9e94-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-x8lxp\" (UID: \"cd85a0e5-8d2e-474b-a9cf-732f436c9e94\") " pod="cert-manager/cert-manager-cainjector-68b757865b-x8lxp"
Apr 22 21:16:00.191087 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.191061 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cd85a0e5-8d2e-474b-a9cf-732f436c9e94-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-x8lxp\" (UID: \"cd85a0e5-8d2e-474b-a9cf-732f436c9e94\") " pod="cert-manager/cert-manager-cainjector-68b757865b-x8lxp"
Apr 22 21:16:00.191198 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.191140 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rfjg\" (UniqueName: \"kubernetes.io/projected/cd85a0e5-8d2e-474b-a9cf-732f436c9e94-kube-api-access-4rfjg\") pod \"cert-manager-cainjector-68b757865b-x8lxp\" (UID: \"cd85a0e5-8d2e-474b-a9cf-732f436c9e94\") " pod="cert-manager/cert-manager-cainjector-68b757865b-x8lxp"
Apr 22 21:16:00.334176 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.334138 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-x8lxp"
Apr 22 21:16:00.450674 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.450625 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-x8lxp"]
Apr 22 21:16:00.453360 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:16:00.453331 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd85a0e5_8d2e_474b_a9cf_732f436c9e94.slice/crio-f7349b6b9d752df0961c5f593f5716eb4144f9076061840d1c47de336158e97e WatchSource:0}: Error finding container f7349b6b9d752df0961c5f593f5716eb4144f9076061840d1c47de336158e97e: Status 404 returned error can't find the container with id f7349b6b9d752df0961c5f593f5716eb4144f9076061840d1c47de336158e97e
Apr 22 21:16:00.487959 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:00.487922 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-x8lxp" event={"ID":"cd85a0e5-8d2e-474b-a9cf-732f436c9e94","Type":"ContainerStarted","Data":"f7349b6b9d752df0961c5f593f5716eb4144f9076061840d1c47de336158e97e"}
Apr 22 21:16:03.497320 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:03.497280 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-x8lxp" event={"ID":"cd85a0e5-8d2e-474b-a9cf-732f436c9e94","Type":"ContainerStarted","Data":"412c4818e5bdee51232127f3fde21d2a299e5715b63430074c7a0109ed2f959e"}
Apr 22 21:16:03.511387 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:03.511321 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-x8lxp" podStartSLOduration=1.012208935 podStartE2EDuration="3.511306908s" podCreationTimestamp="2026-04-22 21:16:00 +0000 UTC" firstStartedPulling="2026-04-22 21:16:00.455177552 +0000 UTC m=+417.811992283" lastFinishedPulling="2026-04-22 21:16:02.954275526 +0000 UTC m=+420.311090256" observedRunningTime="2026-04-22 21:16:03.510510215 +0000 UTC m=+420.867324959" watchObservedRunningTime="2026-04-22 21:16:03.511306908 +0000 UTC m=+420.868121659"
Apr 22 21:16:08.994870 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:08.994808 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw"]
Apr 22 21:16:08.998442 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:08.998402 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw"
Apr 22 21:16:09.000538 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:09.000514 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 21:16:09.000711 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:09.000547 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 21:16:09.000846 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:09.000829 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bx54h\""
Apr 22 21:16:09.005633 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:09.005387 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw"]
Apr 22 21:16:09.046390 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:09.046323 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/12777f1d-3197-4ddd-93fa-249cc2c76495-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw\" (UID: \"12777f1d-3197-4ddd-93fa-249cc2c76495\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw"
Apr 22 21:16:09.046591 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:09.046432 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgjlr\" (UniqueName: \"kubernetes.io/projected/12777f1d-3197-4ddd-93fa-249cc2c76495-kube-api-access-kgjlr\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw\" (UID: \"12777f1d-3197-4ddd-93fa-249cc2c76495\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw"
Apr 22 21:16:09.046591 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:09.046459 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/12777f1d-3197-4ddd-93fa-249cc2c76495-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw\" (UID: \"12777f1d-3197-4ddd-93fa-249cc2c76495\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw"
Apr 22 21:16:09.147380 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:09.147342 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgjlr\" (UniqueName: \"kubernetes.io/projected/12777f1d-3197-4ddd-93fa-249cc2c76495-kube-api-access-kgjlr\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw\" (UID: \"12777f1d-3197-4ddd-93fa-249cc2c76495\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw"
Apr 22 21:16:09.147599 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:09.147392 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/12777f1d-3197-4ddd-93fa-249cc2c76495-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw\" (UID: \"12777f1d-3197-4ddd-93fa-249cc2c76495\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw"
Apr 22 21:16:09.147599 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:09.147465 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/12777f1d-3197-4ddd-93fa-249cc2c76495-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw\" (UID: \"12777f1d-3197-4ddd-93fa-249cc2c76495\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw"
Apr 22 21:16:09.147745 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:09.147723 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/12777f1d-3197-4ddd-93fa-249cc2c76495-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw\" (UID: \"12777f1d-3197-4ddd-93fa-249cc2c76495\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw"
Apr 22 21:16:09.147859 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:09.147840 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/12777f1d-3197-4ddd-93fa-249cc2c76495-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw\" (UID: \"12777f1d-3197-4ddd-93fa-249cc2c76495\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw"
Apr 22 21:16:09.154518 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:09.154493 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgjlr\" (UniqueName: \"kubernetes.io/projected/12777f1d-3197-4ddd-93fa-249cc2c76495-kube-api-access-kgjlr\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw\" (UID: \"12777f1d-3197-4ddd-93fa-249cc2c76495\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw"
Apr 22 21:16:09.308681 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:09.308579 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw"
Apr 22 21:16:09.425944 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:09.425915 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw"]
Apr 22 21:16:09.428557 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:16:09.428526 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12777f1d_3197_4ddd_93fa_249cc2c76495.slice/crio-a1f5b1a6f7a84bcdde2931ce2ff951fbdf9894d71e7ee73042de3316f590d605 WatchSource:0}: Error finding container a1f5b1a6f7a84bcdde2931ce2ff951fbdf9894d71e7ee73042de3316f590d605: Status 404 returned error can't find the container with id a1f5b1a6f7a84bcdde2931ce2ff951fbdf9894d71e7ee73042de3316f590d605
Apr 22 21:16:09.515391 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:09.515348 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw" event={"ID":"12777f1d-3197-4ddd-93fa-249cc2c76495","Type":"ContainerStarted","Data":"61f03e2e8ce3b55457cd30d24fd2c82178b6e1313b1dd689f0940767c41c3109"}
Apr 22 21:16:09.515391 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:09.515394 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw"
event={"ID":"12777f1d-3197-4ddd-93fa-249cc2c76495","Type":"ContainerStarted","Data":"a1f5b1a6f7a84bcdde2931ce2ff951fbdf9894d71e7ee73042de3316f590d605"} Apr 22 21:16:10.519468 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:10.519382 2566 generic.go:358] "Generic (PLEG): container finished" podID="12777f1d-3197-4ddd-93fa-249cc2c76495" containerID="61f03e2e8ce3b55457cd30d24fd2c82178b6e1313b1dd689f0940767c41c3109" exitCode=0 Apr 22 21:16:10.519468 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:10.519442 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw" event={"ID":"12777f1d-3197-4ddd-93fa-249cc2c76495","Type":"ContainerDied","Data":"61f03e2e8ce3b55457cd30d24fd2c82178b6e1313b1dd689f0940767c41c3109"} Apr 22 21:16:11.523103 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:11.523066 2566 generic.go:358] "Generic (PLEG): container finished" podID="12777f1d-3197-4ddd-93fa-249cc2c76495" containerID="b7cb01952a3254f5875d6dcfda27346b0498521bbdd56ffa96e912f1b5b2ace5" exitCode=0 Apr 22 21:16:11.523479 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:11.523116 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw" event={"ID":"12777f1d-3197-4ddd-93fa-249cc2c76495","Type":"ContainerDied","Data":"b7cb01952a3254f5875d6dcfda27346b0498521bbdd56ffa96e912f1b5b2ace5"} Apr 22 21:16:12.528042 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:12.528001 2566 generic.go:358] "Generic (PLEG): container finished" podID="12777f1d-3197-4ddd-93fa-249cc2c76495" containerID="0ae2b43c4924698ae6f21f3b6615129af2d1ff67edad8a16b17788a56d33ecd3" exitCode=0 Apr 22 21:16:12.528042 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:12.528044 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw" 
event={"ID":"12777f1d-3197-4ddd-93fa-249cc2c76495","Type":"ContainerDied","Data":"0ae2b43c4924698ae6f21f3b6615129af2d1ff67edad8a16b17788a56d33ecd3"} Apr 22 21:16:12.611677 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:12.611641 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-vbw7t"] Apr 22 21:16:12.614426 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:12.614395 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-vbw7t" Apr 22 21:16:12.616287 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:12.616267 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-hq4bv\"" Apr 22 21:16:12.619870 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:12.619844 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-vbw7t"] Apr 22 21:16:12.675487 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:12.675440 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1f73381-966d-429c-9316-4e69ed5fdc23-bound-sa-token\") pod \"cert-manager-79c8d999ff-vbw7t\" (UID: \"e1f73381-966d-429c-9316-4e69ed5fdc23\") " pod="cert-manager/cert-manager-79c8d999ff-vbw7t" Apr 22 21:16:12.675487 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:12.675487 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62vfr\" (UniqueName: \"kubernetes.io/projected/e1f73381-966d-429c-9316-4e69ed5fdc23-kube-api-access-62vfr\") pod \"cert-manager-79c8d999ff-vbw7t\" (UID: \"e1f73381-966d-429c-9316-4e69ed5fdc23\") " pod="cert-manager/cert-manager-79c8d999ff-vbw7t" Apr 22 21:16:12.776459 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:12.776372 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/e1f73381-966d-429c-9316-4e69ed5fdc23-bound-sa-token\") pod \"cert-manager-79c8d999ff-vbw7t\" (UID: \"e1f73381-966d-429c-9316-4e69ed5fdc23\") " pod="cert-manager/cert-manager-79c8d999ff-vbw7t" Apr 22 21:16:12.776459 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:12.776454 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62vfr\" (UniqueName: \"kubernetes.io/projected/e1f73381-966d-429c-9316-4e69ed5fdc23-kube-api-access-62vfr\") pod \"cert-manager-79c8d999ff-vbw7t\" (UID: \"e1f73381-966d-429c-9316-4e69ed5fdc23\") " pod="cert-manager/cert-manager-79c8d999ff-vbw7t" Apr 22 21:16:12.783621 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:12.783552 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e1f73381-966d-429c-9316-4e69ed5fdc23-bound-sa-token\") pod \"cert-manager-79c8d999ff-vbw7t\" (UID: \"e1f73381-966d-429c-9316-4e69ed5fdc23\") " pod="cert-manager/cert-manager-79c8d999ff-vbw7t" Apr 22 21:16:12.783750 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:12.783695 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62vfr\" (UniqueName: \"kubernetes.io/projected/e1f73381-966d-429c-9316-4e69ed5fdc23-kube-api-access-62vfr\") pod \"cert-manager-79c8d999ff-vbw7t\" (UID: \"e1f73381-966d-429c-9316-4e69ed5fdc23\") " pod="cert-manager/cert-manager-79c8d999ff-vbw7t" Apr 22 21:16:12.924011 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:12.923959 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-vbw7t" Apr 22 21:16:13.042272 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:13.042247 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-vbw7t"] Apr 22 21:16:13.044535 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:16:13.044505 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1f73381_966d_429c_9316_4e69ed5fdc23.slice/crio-66333df913d61e9b2838e15ccde918010e8fb264898279358079de73d52af38d WatchSource:0}: Error finding container 66333df913d61e9b2838e15ccde918010e8fb264898279358079de73d52af38d: Status 404 returned error can't find the container with id 66333df913d61e9b2838e15ccde918010e8fb264898279358079de73d52af38d Apr 22 21:16:13.531635 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:13.531592 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-vbw7t" event={"ID":"e1f73381-966d-429c-9316-4e69ed5fdc23","Type":"ContainerStarted","Data":"d187e9eee949e10d654f396d67d3344f95e38ac23bda21a7e9665358a13f7f45"} Apr 22 21:16:13.532031 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:13.531641 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-vbw7t" event={"ID":"e1f73381-966d-429c-9316-4e69ed5fdc23","Type":"ContainerStarted","Data":"66333df913d61e9b2838e15ccde918010e8fb264898279358079de73d52af38d"} Apr 22 21:16:13.547567 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:13.547520 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-vbw7t" podStartSLOduration=1.547504601 podStartE2EDuration="1.547504601s" podCreationTimestamp="2026-04-22 21:16:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:16:13.546712091 +0000 UTC 
m=+430.903526844" watchObservedRunningTime="2026-04-22 21:16:13.547504601 +0000 UTC m=+430.904319354" Apr 22 21:16:13.652727 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:13.652704 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw" Apr 22 21:16:13.785861 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:13.785767 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/12777f1d-3197-4ddd-93fa-249cc2c76495-bundle\") pod \"12777f1d-3197-4ddd-93fa-249cc2c76495\" (UID: \"12777f1d-3197-4ddd-93fa-249cc2c76495\") " Apr 22 21:16:13.785861 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:13.785832 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/12777f1d-3197-4ddd-93fa-249cc2c76495-util\") pod \"12777f1d-3197-4ddd-93fa-249cc2c76495\" (UID: \"12777f1d-3197-4ddd-93fa-249cc2c76495\") " Apr 22 21:16:13.786082 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:13.785870 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgjlr\" (UniqueName: \"kubernetes.io/projected/12777f1d-3197-4ddd-93fa-249cc2c76495-kube-api-access-kgjlr\") pod \"12777f1d-3197-4ddd-93fa-249cc2c76495\" (UID: \"12777f1d-3197-4ddd-93fa-249cc2c76495\") " Apr 22 21:16:13.786718 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:13.786673 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12777f1d-3197-4ddd-93fa-249cc2c76495-bundle" (OuterVolumeSpecName: "bundle") pod "12777f1d-3197-4ddd-93fa-249cc2c76495" (UID: "12777f1d-3197-4ddd-93fa-249cc2c76495"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:16:13.787991 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:13.787937 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12777f1d-3197-4ddd-93fa-249cc2c76495-kube-api-access-kgjlr" (OuterVolumeSpecName: "kube-api-access-kgjlr") pod "12777f1d-3197-4ddd-93fa-249cc2c76495" (UID: "12777f1d-3197-4ddd-93fa-249cc2c76495"). InnerVolumeSpecName "kube-api-access-kgjlr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:16:13.793143 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:13.793111 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12777f1d-3197-4ddd-93fa-249cc2c76495-util" (OuterVolumeSpecName: "util") pod "12777f1d-3197-4ddd-93fa-249cc2c76495" (UID: "12777f1d-3197-4ddd-93fa-249cc2c76495"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:16:13.887128 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:13.887090 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/12777f1d-3197-4ddd-93fa-249cc2c76495-util\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:16:13.887128 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:13.887122 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kgjlr\" (UniqueName: \"kubernetes.io/projected/12777f1d-3197-4ddd-93fa-249cc2c76495-kube-api-access-kgjlr\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:16:13.887128 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:13.887134 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/12777f1d-3197-4ddd-93fa-249cc2c76495-bundle\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:16:14.536262 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:14.536226 2566 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw" Apr 22 21:16:14.536262 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:14.536230 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5dqbgw" event={"ID":"12777f1d-3197-4ddd-93fa-249cc2c76495","Type":"ContainerDied","Data":"a1f5b1a6f7a84bcdde2931ce2ff951fbdf9894d71e7ee73042de3316f590d605"} Apr 22 21:16:14.536666 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:14.536282 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1f5b1a6f7a84bcdde2931ce2ff951fbdf9894d71e7ee73042de3316f590d605" Apr 22 21:16:26.014105 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.014017 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn"] Apr 22 21:16:26.014530 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.014288 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12777f1d-3197-4ddd-93fa-249cc2c76495" containerName="util" Apr 22 21:16:26.014530 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.014300 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="12777f1d-3197-4ddd-93fa-249cc2c76495" containerName="util" Apr 22 21:16:26.014530 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.014313 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12777f1d-3197-4ddd-93fa-249cc2c76495" containerName="extract" Apr 22 21:16:26.014530 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.014318 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="12777f1d-3197-4ddd-93fa-249cc2c76495" containerName="extract" Apr 22 21:16:26.014530 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.014327 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="12777f1d-3197-4ddd-93fa-249cc2c76495" containerName="pull" Apr 22 21:16:26.014530 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.014333 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="12777f1d-3197-4ddd-93fa-249cc2c76495" containerName="pull" Apr 22 21:16:26.014530 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.014371 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="12777f1d-3197-4ddd-93fa-249cc2c76495" containerName="extract" Apr 22 21:16:26.021992 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.021973 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn" Apr 22 21:16:26.024198 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.024170 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bx54h\"" Apr 22 21:16:26.024453 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.024426 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 21:16:26.024610 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.024591 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 21:16:26.025891 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.025867 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn"] Apr 22 21:16:26.175222 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.175163 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38376534-3a94-45f5-998f-876c27c061fa-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn\" (UID: 
\"38376534-3a94-45f5-998f-876c27c061fa\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn" Apr 22 21:16:26.175393 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.175298 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38376534-3a94-45f5-998f-876c27c061fa-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn\" (UID: \"38376534-3a94-45f5-998f-876c27c061fa\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn" Apr 22 21:16:26.175393 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.175346 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b58lb\" (UniqueName: \"kubernetes.io/projected/38376534-3a94-45f5-998f-876c27c061fa-kube-api-access-b58lb\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn\" (UID: \"38376534-3a94-45f5-998f-876c27c061fa\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn" Apr 22 21:16:26.276083 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.275995 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38376534-3a94-45f5-998f-876c27c061fa-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn\" (UID: \"38376534-3a94-45f5-998f-876c27c061fa\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn" Apr 22 21:16:26.276083 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.276042 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b58lb\" (UniqueName: \"kubernetes.io/projected/38376534-3a94-45f5-998f-876c27c061fa-kube-api-access-b58lb\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn\" (UID: 
\"38376534-3a94-45f5-998f-876c27c061fa\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn" Apr 22 21:16:26.276291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.276146 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38376534-3a94-45f5-998f-876c27c061fa-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn\" (UID: \"38376534-3a94-45f5-998f-876c27c061fa\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn" Apr 22 21:16:26.276495 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.276470 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38376534-3a94-45f5-998f-876c27c061fa-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn\" (UID: \"38376534-3a94-45f5-998f-876c27c061fa\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn" Apr 22 21:16:26.276495 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.276487 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38376534-3a94-45f5-998f-876c27c061fa-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn\" (UID: \"38376534-3a94-45f5-998f-876c27c061fa\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn" Apr 22 21:16:26.284159 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.284136 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b58lb\" (UniqueName: \"kubernetes.io/projected/38376534-3a94-45f5-998f-876c27c061fa-kube-api-access-b58lb\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn\" (UID: \"38376534-3a94-45f5-998f-876c27c061fa\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn" Apr 22 21:16:26.331542 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.331512 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn" Apr 22 21:16:26.451395 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.451359 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn"] Apr 22 21:16:26.454358 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:16:26.454328 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38376534_3a94_45f5_998f_876c27c061fa.slice/crio-ed1e82839fde4ae841a54c29cdedd72f2815bb5d8650f88f94054b424edee628 WatchSource:0}: Error finding container ed1e82839fde4ae841a54c29cdedd72f2815bb5d8650f88f94054b424edee628: Status 404 returned error can't find the container with id ed1e82839fde4ae841a54c29cdedd72f2815bb5d8650f88f94054b424edee628 Apr 22 21:16:26.572368 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.572325 2566 generic.go:358] "Generic (PLEG): container finished" podID="38376534-3a94-45f5-998f-876c27c061fa" containerID="338156f4cf40c50ebc246ae7e2cc1f0d1304b66d4dadcadfd025e6b5ed952c17" exitCode=0 Apr 22 21:16:26.572544 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.572375 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn" event={"ID":"38376534-3a94-45f5-998f-876c27c061fa","Type":"ContainerDied","Data":"338156f4cf40c50ebc246ae7e2cc1f0d1304b66d4dadcadfd025e6b5ed952c17"} Apr 22 21:16:26.572544 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.572402 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn" event={"ID":"38376534-3a94-45f5-998f-876c27c061fa","Type":"ContainerStarted","Data":"ed1e82839fde4ae841a54c29cdedd72f2815bb5d8650f88f94054b424edee628"} Apr 22 21:16:26.767229 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.767193 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-65d8664856-2kn6j"] Apr 22 21:16:26.770852 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.770828 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-2kn6j" Apr 22 21:16:26.773732 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.773705 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 22 21:16:26.773938 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.773914 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 22 21:16:26.773938 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.773928 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 22 21:16:26.774129 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.774008 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-s8gxl\"" Apr 22 21:16:26.774211 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.774196 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 22 21:16:26.790017 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.789990 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["opendatahub/opendatahub-operator-controller-manager-65d8664856-2kn6j"] Apr 22 21:16:26.880305 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.880261 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1d3599da-edfb-4118-bf7e-9534da06b88b-webhook-cert\") pod \"opendatahub-operator-controller-manager-65d8664856-2kn6j\" (UID: \"1d3599da-edfb-4118-bf7e-9534da06b88b\") " pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-2kn6j" Apr 22 21:16:26.880533 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.880325 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghshf\" (UniqueName: \"kubernetes.io/projected/1d3599da-edfb-4118-bf7e-9534da06b88b-kube-api-access-ghshf\") pod \"opendatahub-operator-controller-manager-65d8664856-2kn6j\" (UID: \"1d3599da-edfb-4118-bf7e-9534da06b88b\") " pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-2kn6j" Apr 22 21:16:26.880533 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.880371 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1d3599da-edfb-4118-bf7e-9534da06b88b-apiservice-cert\") pod \"opendatahub-operator-controller-manager-65d8664856-2kn6j\" (UID: \"1d3599da-edfb-4118-bf7e-9534da06b88b\") " pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-2kn6j" Apr 22 21:16:26.980872 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.980829 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1d3599da-edfb-4118-bf7e-9534da06b88b-webhook-cert\") pod \"opendatahub-operator-controller-manager-65d8664856-2kn6j\" (UID: \"1d3599da-edfb-4118-bf7e-9534da06b88b\") " pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-2kn6j" 
Apr 22 21:16:26.981050 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.980881 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghshf\" (UniqueName: \"kubernetes.io/projected/1d3599da-edfb-4118-bf7e-9534da06b88b-kube-api-access-ghshf\") pod \"opendatahub-operator-controller-manager-65d8664856-2kn6j\" (UID: \"1d3599da-edfb-4118-bf7e-9534da06b88b\") " pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-2kn6j"
Apr 22 21:16:26.981050 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.980909 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1d3599da-edfb-4118-bf7e-9534da06b88b-apiservice-cert\") pod \"opendatahub-operator-controller-manager-65d8664856-2kn6j\" (UID: \"1d3599da-edfb-4118-bf7e-9534da06b88b\") " pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-2kn6j"
Apr 22 21:16:26.983336 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.983309 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1d3599da-edfb-4118-bf7e-9534da06b88b-apiservice-cert\") pod \"opendatahub-operator-controller-manager-65d8664856-2kn6j\" (UID: \"1d3599da-edfb-4118-bf7e-9534da06b88b\") " pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-2kn6j"
Apr 22 21:16:26.983469 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.983345 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1d3599da-edfb-4118-bf7e-9534da06b88b-webhook-cert\") pod \"opendatahub-operator-controller-manager-65d8664856-2kn6j\" (UID: \"1d3599da-edfb-4118-bf7e-9534da06b88b\") " pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-2kn6j"
Apr 22 21:16:26.987853 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:26.987829 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghshf\" (UniqueName: \"kubernetes.io/projected/1d3599da-edfb-4118-bf7e-9534da06b88b-kube-api-access-ghshf\") pod \"opendatahub-operator-controller-manager-65d8664856-2kn6j\" (UID: \"1d3599da-edfb-4118-bf7e-9534da06b88b\") " pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-2kn6j"
Apr 22 21:16:27.080732 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:27.080710 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-2kn6j"
Apr 22 21:16:27.208555 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:27.208527 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-65d8664856-2kn6j"]
Apr 22 21:16:27.219201 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:16:27.219166 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d3599da_edfb_4118_bf7e_9534da06b88b.slice/crio-1c536fbebe2cf5a7c01cb9898010303c35f7321b21e143ad919d7e5401be8076 WatchSource:0}: Error finding container 1c536fbebe2cf5a7c01cb9898010303c35f7321b21e143ad919d7e5401be8076: Status 404 returned error can't find the container with id 1c536fbebe2cf5a7c01cb9898010303c35f7321b21e143ad919d7e5401be8076
Apr 22 21:16:27.576816 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:27.576776 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-2kn6j" event={"ID":"1d3599da-edfb-4118-bf7e-9534da06b88b","Type":"ContainerStarted","Data":"1c536fbebe2cf5a7c01cb9898010303c35f7321b21e143ad919d7e5401be8076"}
Apr 22 21:16:27.578494 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:27.578459 2566 generic.go:358] "Generic (PLEG): container finished" podID="38376534-3a94-45f5-998f-876c27c061fa" containerID="ab99c57720a32ec01d2f621c4d7af29392c916a337871986340321e40df8e276" exitCode=0
Apr 22 21:16:27.578649 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:27.578497 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn" event={"ID":"38376534-3a94-45f5-998f-876c27c061fa","Type":"ContainerDied","Data":"ab99c57720a32ec01d2f621c4d7af29392c916a337871986340321e40df8e276"}
Apr 22 21:16:28.583989 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:28.583948 2566 generic.go:358] "Generic (PLEG): container finished" podID="38376534-3a94-45f5-998f-876c27c061fa" containerID="93a2322bc99aa33ac7e412bdfc91a2555110ebbe369fff4e519b7ddb74f42c33" exitCode=0
Apr 22 21:16:28.584362 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:28.584025 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn" event={"ID":"38376534-3a94-45f5-998f-876c27c061fa","Type":"ContainerDied","Data":"93a2322bc99aa33ac7e412bdfc91a2555110ebbe369fff4e519b7ddb74f42c33"}
Apr 22 21:16:29.839971 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:29.839911 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn"
Apr 22 21:16:29.904985 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:29.904960 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38376534-3a94-45f5-998f-876c27c061fa-bundle\") pod \"38376534-3a94-45f5-998f-876c27c061fa\" (UID: \"38376534-3a94-45f5-998f-876c27c061fa\") "
Apr 22 21:16:29.905082 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:29.904998 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b58lb\" (UniqueName: \"kubernetes.io/projected/38376534-3a94-45f5-998f-876c27c061fa-kube-api-access-b58lb\") pod \"38376534-3a94-45f5-998f-876c27c061fa\" (UID: \"38376534-3a94-45f5-998f-876c27c061fa\") "
Apr 22 21:16:29.905082 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:29.905031 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38376534-3a94-45f5-998f-876c27c061fa-util\") pod \"38376534-3a94-45f5-998f-876c27c061fa\" (UID: \"38376534-3a94-45f5-998f-876c27c061fa\") "
Apr 22 21:16:29.906164 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:29.906134 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38376534-3a94-45f5-998f-876c27c061fa-bundle" (OuterVolumeSpecName: "bundle") pod "38376534-3a94-45f5-998f-876c27c061fa" (UID: "38376534-3a94-45f5-998f-876c27c061fa"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:16:29.906926 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:29.906905 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38376534-3a94-45f5-998f-876c27c061fa-kube-api-access-b58lb" (OuterVolumeSpecName: "kube-api-access-b58lb") pod "38376534-3a94-45f5-998f-876c27c061fa" (UID: "38376534-3a94-45f5-998f-876c27c061fa"). InnerVolumeSpecName "kube-api-access-b58lb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 21:16:29.911209 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:29.911178 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38376534-3a94-45f5-998f-876c27c061fa-util" (OuterVolumeSpecName: "util") pod "38376534-3a94-45f5-998f-876c27c061fa" (UID: "38376534-3a94-45f5-998f-876c27c061fa"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:16:30.006490 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:30.006450 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/38376534-3a94-45f5-998f-876c27c061fa-bundle\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\""
Apr 22 21:16:30.006490 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:30.006488 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b58lb\" (UniqueName: \"kubernetes.io/projected/38376534-3a94-45f5-998f-876c27c061fa-kube-api-access-b58lb\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\""
Apr 22 21:16:30.006712 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:30.006502 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/38376534-3a94-45f5-998f-876c27c061fa-util\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\""
Apr 22 21:16:30.594835 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:30.594807 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn"
Apr 22 21:16:30.594835 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:30.594819 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94xctn" event={"ID":"38376534-3a94-45f5-998f-876c27c061fa","Type":"ContainerDied","Data":"ed1e82839fde4ae841a54c29cdedd72f2815bb5d8650f88f94054b424edee628"}
Apr 22 21:16:30.595053 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:30.594860 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed1e82839fde4ae841a54c29cdedd72f2815bb5d8650f88f94054b424edee628"
Apr 22 21:16:30.596296 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:30.596271 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-2kn6j" event={"ID":"1d3599da-edfb-4118-bf7e-9534da06b88b","Type":"ContainerStarted","Data":"822c2e34d64274e0ef7a73b5b9f4100cd1c4782685fb52e84318b358934af8a5"}
Apr 22 21:16:30.596441 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:30.596419 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-2kn6j"
Apr 22 21:16:30.613682 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:30.613618 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-2kn6j" podStartSLOduration=1.9529892439999998 podStartE2EDuration="4.613599385s" podCreationTimestamp="2026-04-22 21:16:26 +0000 UTC" firstStartedPulling="2026-04-22 21:16:27.221056936 +0000 UTC m=+444.577871666" lastFinishedPulling="2026-04-22 21:16:29.881667077 +0000 UTC m=+447.238481807" observedRunningTime="2026-04-22 21:16:30.612739225 +0000 UTC m=+447.969553978" watchObservedRunningTime="2026-04-22 21:16:30.613599385 +0000 UTC m=+447.970414138"
Apr 22 21:16:41.601893 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:41.601860 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-65d8664856-2kn6j"
Apr 22 21:16:47.809319 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:47.809288 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-rbgsq"]
Apr 22 21:16:47.809760 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:47.809559 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38376534-3a94-45f5-998f-876c27c061fa" containerName="util"
Apr 22 21:16:47.809760 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:47.809570 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="38376534-3a94-45f5-998f-876c27c061fa" containerName="util"
Apr 22 21:16:47.809760 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:47.809585 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38376534-3a94-45f5-998f-876c27c061fa" containerName="extract"
Apr 22 21:16:47.809760 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:47.809590 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="38376534-3a94-45f5-998f-876c27c061fa" containerName="extract"
Apr 22 21:16:47.809760 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:47.809602 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38376534-3a94-45f5-998f-876c27c061fa" containerName="pull"
Apr 22 21:16:47.809760 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:47.809608 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="38376534-3a94-45f5-998f-876c27c061fa" containerName="pull"
Apr 22 21:16:47.809760 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:47.809646 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="38376534-3a94-45f5-998f-876c27c061fa" containerName="extract"
Apr 22 21:16:47.811978 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:47.811963 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-rbgsq"
Apr 22 21:16:47.813640 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:47.813620 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-dskfv\""
Apr 22 21:16:47.813728 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:47.813692 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 22 21:16:47.817811 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:47.817786 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-rbgsq"]
Apr 22 21:16:47.933748 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:47.933710 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h68f9\" (UniqueName: \"kubernetes.io/projected/75d67504-5aa5-459b-8ec5-a68199bd0be6-kube-api-access-h68f9\") pod \"odh-model-controller-858dbf95b8-rbgsq\" (UID: \"75d67504-5aa5-459b-8ec5-a68199bd0be6\") " pod="opendatahub/odh-model-controller-858dbf95b8-rbgsq"
Apr 22 21:16:47.933928 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:47.933756 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75d67504-5aa5-459b-8ec5-a68199bd0be6-cert\") pod \"odh-model-controller-858dbf95b8-rbgsq\" (UID: \"75d67504-5aa5-459b-8ec5-a68199bd0be6\") " pod="opendatahub/odh-model-controller-858dbf95b8-rbgsq"
Apr 22 21:16:48.034307 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:48.034262 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75d67504-5aa5-459b-8ec5-a68199bd0be6-cert\") pod \"odh-model-controller-858dbf95b8-rbgsq\" (UID: \"75d67504-5aa5-459b-8ec5-a68199bd0be6\") " pod="opendatahub/odh-model-controller-858dbf95b8-rbgsq"
Apr 22 21:16:48.034534 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:48.034368 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h68f9\" (UniqueName: \"kubernetes.io/projected/75d67504-5aa5-459b-8ec5-a68199bd0be6-kube-api-access-h68f9\") pod \"odh-model-controller-858dbf95b8-rbgsq\" (UID: \"75d67504-5aa5-459b-8ec5-a68199bd0be6\") " pod="opendatahub/odh-model-controller-858dbf95b8-rbgsq"
Apr 22 21:16:48.034534 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:16:48.034463 2566 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 22 21:16:48.034534 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:16:48.034536 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75d67504-5aa5-459b-8ec5-a68199bd0be6-cert podName:75d67504-5aa5-459b-8ec5-a68199bd0be6 nodeName:}" failed. No retries permitted until 2026-04-22 21:16:48.534519581 +0000 UTC m=+465.891334315 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75d67504-5aa5-459b-8ec5-a68199bd0be6-cert") pod "odh-model-controller-858dbf95b8-rbgsq" (UID: "75d67504-5aa5-459b-8ec5-a68199bd0be6") : secret "odh-model-controller-webhook-cert" not found
Apr 22 21:16:48.042163 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:48.042139 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h68f9\" (UniqueName: \"kubernetes.io/projected/75d67504-5aa5-459b-8ec5-a68199bd0be6-kube-api-access-h68f9\") pod \"odh-model-controller-858dbf95b8-rbgsq\" (UID: \"75d67504-5aa5-459b-8ec5-a68199bd0be6\") " pod="opendatahub/odh-model-controller-858dbf95b8-rbgsq"
Apr 22 21:16:48.538046 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:48.538008 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75d67504-5aa5-459b-8ec5-a68199bd0be6-cert\") pod \"odh-model-controller-858dbf95b8-rbgsq\" (UID: \"75d67504-5aa5-459b-8ec5-a68199bd0be6\") " pod="opendatahub/odh-model-controller-858dbf95b8-rbgsq"
Apr 22 21:16:48.538225 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:16:48.538116 2566 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 22 21:16:48.538225 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:16:48.538180 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75d67504-5aa5-459b-8ec5-a68199bd0be6-cert podName:75d67504-5aa5-459b-8ec5-a68199bd0be6 nodeName:}" failed. No retries permitted until 2026-04-22 21:16:49.538165304 +0000 UTC m=+466.894980033 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/75d67504-5aa5-459b-8ec5-a68199bd0be6-cert") pod "odh-model-controller-858dbf95b8-rbgsq" (UID: "75d67504-5aa5-459b-8ec5-a68199bd0be6") : secret "odh-model-controller-webhook-cert" not found
Apr 22 21:16:49.545736 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:49.545695 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75d67504-5aa5-459b-8ec5-a68199bd0be6-cert\") pod \"odh-model-controller-858dbf95b8-rbgsq\" (UID: \"75d67504-5aa5-459b-8ec5-a68199bd0be6\") " pod="opendatahub/odh-model-controller-858dbf95b8-rbgsq"
Apr 22 21:16:49.548242 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:49.548215 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/75d67504-5aa5-459b-8ec5-a68199bd0be6-cert\") pod \"odh-model-controller-858dbf95b8-rbgsq\" (UID: \"75d67504-5aa5-459b-8ec5-a68199bd0be6\") " pod="opendatahub/odh-model-controller-858dbf95b8-rbgsq"
Apr 22 21:16:49.622233 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:49.622179 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-rbgsq"
Apr 22 21:16:49.742873 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:49.742841 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-rbgsq"]
Apr 22 21:16:49.745906 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:16:49.745868 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75d67504_5aa5_459b_8ec5_a68199bd0be6.slice/crio-d1cb608bcef5e9bf9eb9278dad8884a817ab658eaef723a6444178d6505ea420 WatchSource:0}: Error finding container d1cb608bcef5e9bf9eb9278dad8884a817ab658eaef723a6444178d6505ea420: Status 404 returned error can't find the container with id d1cb608bcef5e9bf9eb9278dad8884a817ab658eaef723a6444178d6505ea420
Apr 22 21:16:50.657484 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:50.657448 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-rbgsq" event={"ID":"75d67504-5aa5-459b-8ec5-a68199bd0be6","Type":"ContainerStarted","Data":"d1cb608bcef5e9bf9eb9278dad8884a817ab658eaef723a6444178d6505ea420"}
Apr 22 21:16:52.665356 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:52.665323 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-rbgsq" event={"ID":"75d67504-5aa5-459b-8ec5-a68199bd0be6","Type":"ContainerStarted","Data":"fce46d618bee1d4e92ca1ae3abcbee23e73d926a383a0a0fa0714089ebe50695"}
Apr 22 21:16:52.665694 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:52.665440 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-rbgsq"
Apr 22 21:16:52.679315 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:52.679265 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-rbgsq" podStartSLOduration=2.826142863 podStartE2EDuration="5.679249842s" podCreationTimestamp="2026-04-22 21:16:47 +0000 UTC" firstStartedPulling="2026-04-22 21:16:49.747201295 +0000 UTC m=+467.104016026" lastFinishedPulling="2026-04-22 21:16:52.600308263 +0000 UTC m=+469.957123005" observedRunningTime="2026-04-22 21:16:52.678246318 +0000 UTC m=+470.035061069" watchObservedRunningTime="2026-04-22 21:16:52.679249842 +0000 UTC m=+470.036064594"
Apr 22 21:16:52.826380 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:52.826344 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-stmtk"]
Apr 22 21:16:52.828310 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:52.828293 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-stmtk"
Apr 22 21:16:52.830365 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:52.830302 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-zvzqb\""
Apr 22 21:16:52.830579 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:52.830560 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\""
Apr 22 21:16:52.845495 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:52.839217 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-stmtk"]
Apr 22 21:16:52.975671 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:52.975577 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/591364e1-7e0f-4f15-8ee7-af576f419ab0-cert\") pod \"kserve-controller-manager-856948b99f-stmtk\" (UID: \"591364e1-7e0f-4f15-8ee7-af576f419ab0\") " pod="opendatahub/kserve-controller-manager-856948b99f-stmtk"
Apr 22 21:16:52.975671 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:52.975626 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scb82\" (UniqueName: \"kubernetes.io/projected/591364e1-7e0f-4f15-8ee7-af576f419ab0-kube-api-access-scb82\") pod \"kserve-controller-manager-856948b99f-stmtk\" (UID: \"591364e1-7e0f-4f15-8ee7-af576f419ab0\") " pod="opendatahub/kserve-controller-manager-856948b99f-stmtk"
Apr 22 21:16:53.077112 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:53.077067 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/591364e1-7e0f-4f15-8ee7-af576f419ab0-cert\") pod \"kserve-controller-manager-856948b99f-stmtk\" (UID: \"591364e1-7e0f-4f15-8ee7-af576f419ab0\") " pod="opendatahub/kserve-controller-manager-856948b99f-stmtk"
Apr 22 21:16:53.077277 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:53.077122 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-scb82\" (UniqueName: \"kubernetes.io/projected/591364e1-7e0f-4f15-8ee7-af576f419ab0-kube-api-access-scb82\") pod \"kserve-controller-manager-856948b99f-stmtk\" (UID: \"591364e1-7e0f-4f15-8ee7-af576f419ab0\") " pod="opendatahub/kserve-controller-manager-856948b99f-stmtk"
Apr 22 21:16:53.079644 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:53.079610 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/591364e1-7e0f-4f15-8ee7-af576f419ab0-cert\") pod \"kserve-controller-manager-856948b99f-stmtk\" (UID: \"591364e1-7e0f-4f15-8ee7-af576f419ab0\") " pod="opendatahub/kserve-controller-manager-856948b99f-stmtk"
Apr 22 21:16:53.087062 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:53.087035 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-scb82\" (UniqueName: \"kubernetes.io/projected/591364e1-7e0f-4f15-8ee7-af576f419ab0-kube-api-access-scb82\") pod \"kserve-controller-manager-856948b99f-stmtk\" (UID: \"591364e1-7e0f-4f15-8ee7-af576f419ab0\") " pod="opendatahub/kserve-controller-manager-856948b99f-stmtk"
Apr 22 21:16:53.145231 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:53.145197 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-stmtk"
Apr 22 21:16:53.267383 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:53.267354 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-stmtk"]
Apr 22 21:16:53.269724 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:16:53.269696 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod591364e1_7e0f_4f15_8ee7_af576f419ab0.slice/crio-6ad045933ee078b65a4c46ea1fa766eb0116b2f1ecd4a16038a7a4e798646a65 WatchSource:0}: Error finding container 6ad045933ee078b65a4c46ea1fa766eb0116b2f1ecd4a16038a7a4e798646a65: Status 404 returned error can't find the container with id 6ad045933ee078b65a4c46ea1fa766eb0116b2f1ecd4a16038a7a4e798646a65
Apr 22 21:16:53.669099 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:53.669063 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-stmtk" event={"ID":"591364e1-7e0f-4f15-8ee7-af576f419ab0","Type":"ContainerStarted","Data":"6ad045933ee078b65a4c46ea1fa766eb0116b2f1ecd4a16038a7a4e798646a65"}
Apr 22 21:16:55.463428 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:55.463367 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-7b8c5f7f67-hvd97"]
Apr 22 21:16:55.466985 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:55.466959 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-hvd97"
Apr 22 21:16:55.468972 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:55.468855 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 21:16:55.468972 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:55.468912 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 22 21:16:55.469159 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:55.469040 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 22 21:16:55.469159 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:55.469046 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 21:16:55.469595 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:55.469568 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-l964r\""
Apr 22 21:16:55.475449 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:55.475423 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7b8c5f7f67-hvd97"]
Apr 22 21:16:55.596446 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:55.596391 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d3c57abc-8177-4236-b215-e1fcf1fa366c-tls-certs\") pod \"kube-auth-proxy-7b8c5f7f67-hvd97\" (UID: \"d3c57abc-8177-4236-b215-e1fcf1fa366c\") " pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-hvd97"
Apr 22 21:16:55.596628 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:55.596542 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d3c57abc-8177-4236-b215-e1fcf1fa366c-tmp\") pod \"kube-auth-proxy-7b8c5f7f67-hvd97\" (UID: \"d3c57abc-8177-4236-b215-e1fcf1fa366c\") " pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-hvd97"
Apr 22 21:16:55.596628 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:55.596576 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87bj4\" (UniqueName: \"kubernetes.io/projected/d3c57abc-8177-4236-b215-e1fcf1fa366c-kube-api-access-87bj4\") pod \"kube-auth-proxy-7b8c5f7f67-hvd97\" (UID: \"d3c57abc-8177-4236-b215-e1fcf1fa366c\") " pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-hvd97"
Apr 22 21:16:55.697530 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:55.697492 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d3c57abc-8177-4236-b215-e1fcf1fa366c-tmp\") pod \"kube-auth-proxy-7b8c5f7f67-hvd97\" (UID: \"d3c57abc-8177-4236-b215-e1fcf1fa366c\") " pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-hvd97"
Apr 22 21:16:55.697709 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:55.697546 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87bj4\" (UniqueName: \"kubernetes.io/projected/d3c57abc-8177-4236-b215-e1fcf1fa366c-kube-api-access-87bj4\") pod \"kube-auth-proxy-7b8c5f7f67-hvd97\" (UID: \"d3c57abc-8177-4236-b215-e1fcf1fa366c\") " pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-hvd97"
Apr 22 21:16:55.697709 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:55.697599 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d3c57abc-8177-4236-b215-e1fcf1fa366c-tls-certs\") pod \"kube-auth-proxy-7b8c5f7f67-hvd97\" (UID: \"d3c57abc-8177-4236-b215-e1fcf1fa366c\") " pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-hvd97"
Apr 22 21:16:55.697709 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:16:55.697707 2566 secret.go:189] Couldn't get secret openshift-ingress/kube-auth-proxy-tls: secret "kube-auth-proxy-tls" not found
Apr 22 21:16:55.697816 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:16:55.697756 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d3c57abc-8177-4236-b215-e1fcf1fa366c-tls-certs podName:d3c57abc-8177-4236-b215-e1fcf1fa366c nodeName:}" failed. No retries permitted until 2026-04-22 21:16:56.197740869 +0000 UTC m=+473.554555600 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/d3c57abc-8177-4236-b215-e1fcf1fa366c-tls-certs") pod "kube-auth-proxy-7b8c5f7f67-hvd97" (UID: "d3c57abc-8177-4236-b215-e1fcf1fa366c") : secret "kube-auth-proxy-tls" not found
Apr 22 21:16:55.700158 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:55.700123 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d3c57abc-8177-4236-b215-e1fcf1fa366c-tmp\") pod \"kube-auth-proxy-7b8c5f7f67-hvd97\" (UID: \"d3c57abc-8177-4236-b215-e1fcf1fa366c\") " pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-hvd97"
Apr 22 21:16:55.705030 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:55.705005 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87bj4\" (UniqueName: \"kubernetes.io/projected/d3c57abc-8177-4236-b215-e1fcf1fa366c-kube-api-access-87bj4\") pod \"kube-auth-proxy-7b8c5f7f67-hvd97\" (UID: \"d3c57abc-8177-4236-b215-e1fcf1fa366c\") " pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-hvd97"
Apr 22 21:16:56.202453 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:56.202343 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d3c57abc-8177-4236-b215-e1fcf1fa366c-tls-certs\") pod \"kube-auth-proxy-7b8c5f7f67-hvd97\" (UID: \"d3c57abc-8177-4236-b215-e1fcf1fa366c\") " pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-hvd97"
Apr 22 21:16:56.204958 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:56.204932 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d3c57abc-8177-4236-b215-e1fcf1fa366c-tls-certs\") pod \"kube-auth-proxy-7b8c5f7f67-hvd97\" (UID: \"d3c57abc-8177-4236-b215-e1fcf1fa366c\") " pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-hvd97"
Apr 22 21:16:56.380214 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:56.380177 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-hvd97"
Apr 22 21:16:56.498444 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:56.498256 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7b8c5f7f67-hvd97"]
Apr 22 21:16:56.500903 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:16:56.500874 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3c57abc_8177_4236_b215_e1fcf1fa366c.slice/crio-b32db64258623ac86a158c96e3c6d0ce7fd4aaa9dda78afdf8cc3400c6051c1f WatchSource:0}: Error finding container b32db64258623ac86a158c96e3c6d0ce7fd4aaa9dda78afdf8cc3400c6051c1f: Status 404 returned error can't find the container with id b32db64258623ac86a158c96e3c6d0ce7fd4aaa9dda78afdf8cc3400c6051c1f
Apr 22 21:16:56.680231 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:56.680194 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-stmtk" event={"ID":"591364e1-7e0f-4f15-8ee7-af576f419ab0","Type":"ContainerStarted","Data":"37ca1c3f69e50ff7ffbeb744d4e686c8998c367a2fabf18b6da01be6763831c7"}
Apr 22 21:16:56.680447 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:56.680296 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-stmtk"
Apr 22 21:16:56.681270 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:56.681243 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-hvd97" event={"ID":"d3c57abc-8177-4236-b215-e1fcf1fa366c","Type":"ContainerStarted","Data":"b32db64258623ac86a158c96e3c6d0ce7fd4aaa9dda78afdf8cc3400c6051c1f"}
Apr 22 21:16:56.695691 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:56.695649 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-stmtk" podStartSLOduration=2.131465527 podStartE2EDuration="4.69563805s" podCreationTimestamp="2026-04-22 21:16:52 +0000 UTC" firstStartedPulling="2026-04-22 21:16:53.270972416 +0000 UTC m=+470.627787149" lastFinishedPulling="2026-04-22 21:16:55.835144942 +0000 UTC m=+473.191959672" observedRunningTime="2026-04-22 21:16:56.695596722 +0000 UTC m=+474.052411473" watchObservedRunningTime="2026-04-22 21:16:56.69563805 +0000 UTC m=+474.052452803"
Apr 22 21:16:57.084458 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:57.084420 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8"]
Apr 22 21:16:57.087968 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:57.087938 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8"
Apr 22 21:16:57.090193 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:57.090157 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 21:16:57.090314 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:57.090212 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bx54h\""
Apr 22 21:16:57.090314 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:57.090159 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 21:16:57.093855 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:57.093829 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8"]
Apr 22 21:16:57.212511 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:57.212472 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4824516-e2bc-41e6-9069-2b4732c413eb-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8\" (UID: \"a4824516-e2bc-41e6-9069-2b4732c413eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8"
Apr 22 21:16:57.212730 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:57.212589 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g98c\" (UniqueName: \"kubernetes.io/projected/a4824516-e2bc-41e6-9069-2b4732c413eb-kube-api-access-5g98c\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8\" (UID: \"a4824516-e2bc-41e6-9069-2b4732c413eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8"
Apr 22 21:16:57.212730 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:57.212648 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4824516-e2bc-41e6-9069-2b4732c413eb-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8\" (UID: \"a4824516-e2bc-41e6-9069-2b4732c413eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8"
Apr 22 21:16:57.313316 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:57.313275 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4824516-e2bc-41e6-9069-2b4732c413eb-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8\" (UID: \"a4824516-e2bc-41e6-9069-2b4732c413eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8"
Apr 22 21:16:57.313523 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:57.313377 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5g98c\" (UniqueName: \"kubernetes.io/projected/a4824516-e2bc-41e6-9069-2b4732c413eb-kube-api-access-5g98c\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8\" (UID: \"a4824516-e2bc-41e6-9069-2b4732c413eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8"
Apr 22 21:16:57.313523 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:57.313433 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4824516-e2bc-41e6-9069-2b4732c413eb-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8\" (UID: \"a4824516-e2bc-41e6-9069-2b4732c413eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8"
Apr 22 21:16:57.313795 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:57.313773 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4824516-e2bc-41e6-9069-2b4732c413eb-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8\" (UID: \"a4824516-e2bc-41e6-9069-2b4732c413eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8"
Apr 22 21:16:57.313868 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:57.313842 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4824516-e2bc-41e6-9069-2b4732c413eb-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8\" (UID: \"a4824516-e2bc-41e6-9069-2b4732c413eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8"
Apr 22 21:16:57.321352 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:57.321325 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g98c\" (UniqueName: \"kubernetes.io/projected/a4824516-e2bc-41e6-9069-2b4732c413eb-kube-api-access-5g98c\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8\" (UID: \"a4824516-e2bc-41e6-9069-2b4732c413eb\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8"
Apr 22 21:16:57.400379 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:57.400298 2566 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8" Apr 22 21:16:57.556199 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:57.556144 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8"] Apr 22 21:16:57.559768 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:16:57.559737 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4824516_e2bc_41e6_9069_2b4732c413eb.slice/crio-d1c09afe5ecda2518bd772f3e9ca13bc35814d917d13e48652a20f34f10a3a31 WatchSource:0}: Error finding container d1c09afe5ecda2518bd772f3e9ca13bc35814d917d13e48652a20f34f10a3a31: Status 404 returned error can't find the container with id d1c09afe5ecda2518bd772f3e9ca13bc35814d917d13e48652a20f34f10a3a31 Apr 22 21:16:57.687267 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:57.687228 2566 generic.go:358] "Generic (PLEG): container finished" podID="a4824516-e2bc-41e6-9069-2b4732c413eb" containerID="014634d4d471dd818f94c8dc3b87dc3cd59acb3115ef8309d3dc144157422399" exitCode=0 Apr 22 21:16:57.687467 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:57.687318 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8" event={"ID":"a4824516-e2bc-41e6-9069-2b4732c413eb","Type":"ContainerDied","Data":"014634d4d471dd818f94c8dc3b87dc3cd59acb3115ef8309d3dc144157422399"} Apr 22 21:16:57.687467 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:57.687365 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8" event={"ID":"a4824516-e2bc-41e6-9069-2b4732c413eb","Type":"ContainerStarted","Data":"d1c09afe5ecda2518bd772f3e9ca13bc35814d917d13e48652a20f34f10a3a31"} Apr 22 21:16:59.694791 ip-10-0-134-137 kubenswrapper[2566]: 
I0422 21:16:59.694754 2566 generic.go:358] "Generic (PLEG): container finished" podID="a4824516-e2bc-41e6-9069-2b4732c413eb" containerID="43a7f1f4a2477fb3ba164284e313da883db121142f2be49f919aab361800d2a4" exitCode=0 Apr 22 21:16:59.695171 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:59.694794 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8" event={"ID":"a4824516-e2bc-41e6-9069-2b4732c413eb","Type":"ContainerDied","Data":"43a7f1f4a2477fb3ba164284e313da883db121142f2be49f919aab361800d2a4"} Apr 22 21:16:59.696313 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:59.696287 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-hvd97" event={"ID":"d3c57abc-8177-4236-b215-e1fcf1fa366c","Type":"ContainerStarted","Data":"9d1f35c699d5c582c0de0effb135772c46f24058b0da4db57feb15843fa59140"} Apr 22 21:16:59.725056 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:16:59.725002 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-7b8c5f7f67-hvd97" podStartSLOduration=1.630928087 podStartE2EDuration="4.724985938s" podCreationTimestamp="2026-04-22 21:16:55 +0000 UTC" firstStartedPulling="2026-04-22 21:16:56.502679547 +0000 UTC m=+473.859494277" lastFinishedPulling="2026-04-22 21:16:59.596737399 +0000 UTC m=+476.953552128" observedRunningTime="2026-04-22 21:16:59.72335295 +0000 UTC m=+477.080167715" watchObservedRunningTime="2026-04-22 21:16:59.724985938 +0000 UTC m=+477.081800691" Apr 22 21:17:00.701660 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:00.701625 2566 generic.go:358] "Generic (PLEG): container finished" podID="a4824516-e2bc-41e6-9069-2b4732c413eb" containerID="9b51757ed3a5dbac6ddd82de13f596816bcacdc414c1f350228e6d78a1949936" exitCode=0 Apr 22 21:17:00.702027 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:00.701701 2566 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8" event={"ID":"a4824516-e2bc-41e6-9069-2b4732c413eb","Type":"ContainerDied","Data":"9b51757ed3a5dbac6ddd82de13f596816bcacdc414c1f350228e6d78a1949936"} Apr 22 21:17:01.824024 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:01.823997 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8" Apr 22 21:17:01.950123 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:01.950079 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5g98c\" (UniqueName: \"kubernetes.io/projected/a4824516-e2bc-41e6-9069-2b4732c413eb-kube-api-access-5g98c\") pod \"a4824516-e2bc-41e6-9069-2b4732c413eb\" (UID: \"a4824516-e2bc-41e6-9069-2b4732c413eb\") " Apr 22 21:17:01.950308 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:01.950137 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4824516-e2bc-41e6-9069-2b4732c413eb-util\") pod \"a4824516-e2bc-41e6-9069-2b4732c413eb\" (UID: \"a4824516-e2bc-41e6-9069-2b4732c413eb\") " Apr 22 21:17:01.950308 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:01.950217 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4824516-e2bc-41e6-9069-2b4732c413eb-bundle\") pod \"a4824516-e2bc-41e6-9069-2b4732c413eb\" (UID: \"a4824516-e2bc-41e6-9069-2b4732c413eb\") " Apr 22 21:17:01.951088 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:01.951061 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4824516-e2bc-41e6-9069-2b4732c413eb-bundle" (OuterVolumeSpecName: "bundle") pod "a4824516-e2bc-41e6-9069-2b4732c413eb" (UID: "a4824516-e2bc-41e6-9069-2b4732c413eb"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:17:01.952226 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:01.952196 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4824516-e2bc-41e6-9069-2b4732c413eb-kube-api-access-5g98c" (OuterVolumeSpecName: "kube-api-access-5g98c") pod "a4824516-e2bc-41e6-9069-2b4732c413eb" (UID: "a4824516-e2bc-41e6-9069-2b4732c413eb"). InnerVolumeSpecName "kube-api-access-5g98c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:17:01.958281 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:01.958236 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4824516-e2bc-41e6-9069-2b4732c413eb-util" (OuterVolumeSpecName: "util") pod "a4824516-e2bc-41e6-9069-2b4732c413eb" (UID: "a4824516-e2bc-41e6-9069-2b4732c413eb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:17:02.051235 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:02.051198 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a4824516-e2bc-41e6-9069-2b4732c413eb-bundle\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:17:02.051235 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:02.051231 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5g98c\" (UniqueName: \"kubernetes.io/projected/a4824516-e2bc-41e6-9069-2b4732c413eb-kube-api-access-5g98c\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:17:02.051442 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:02.051247 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a4824516-e2bc-41e6-9069-2b4732c413eb-util\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:17:02.710702 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:02.710663 2566 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8" Apr 22 21:17:02.710877 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:02.710663 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835km8r8" event={"ID":"a4824516-e2bc-41e6-9069-2b4732c413eb","Type":"ContainerDied","Data":"d1c09afe5ecda2518bd772f3e9ca13bc35814d917d13e48652a20f34f10a3a31"} Apr 22 21:17:02.710877 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:02.710784 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1c09afe5ecda2518bd772f3e9ca13bc35814d917d13e48652a20f34f10a3a31" Apr 22 21:17:03.671895 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:03.671861 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-rbgsq" Apr 22 21:17:11.931984 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:11.931948 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq"] Apr 22 21:17:11.932335 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:11.932199 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4824516-e2bc-41e6-9069-2b4732c413eb" containerName="util" Apr 22 21:17:11.932335 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:11.932209 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4824516-e2bc-41e6-9069-2b4732c413eb" containerName="util" Apr 22 21:17:11.932335 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:11.932216 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4824516-e2bc-41e6-9069-2b4732c413eb" containerName="pull" Apr 22 21:17:11.932335 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:11.932222 2566 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a4824516-e2bc-41e6-9069-2b4732c413eb" containerName="pull" Apr 22 21:17:11.932335 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:11.932236 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4824516-e2bc-41e6-9069-2b4732c413eb" containerName="extract" Apr 22 21:17:11.932335 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:11.932242 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4824516-e2bc-41e6-9069-2b4732c413eb" containerName="extract" Apr 22 21:17:11.932335 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:11.932283 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4824516-e2bc-41e6-9069-2b4732c413eb" containerName="extract" Apr 22 21:17:11.941442 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:11.941395 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq"] Apr 22 21:17:11.941582 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:11.941561 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq" Apr 22 21:17:11.943675 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:11.943653 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 21:17:11.943778 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:11.943688 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 21:17:11.944312 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:11.944290 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-bx54h\"" Apr 22 21:17:12.015543 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:12.015508 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69ea02df-009e-4322-9358-aafabf1e2fd9-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq\" (UID: \"69ea02df-009e-4322-9358-aafabf1e2fd9\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq" Apr 22 21:17:12.015690 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:12.015621 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q2wp\" (UniqueName: \"kubernetes.io/projected/69ea02df-009e-4322-9358-aafabf1e2fd9-kube-api-access-9q2wp\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq\" (UID: \"69ea02df-009e-4322-9358-aafabf1e2fd9\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq" Apr 22 21:17:12.015690 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:12.015657 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/69ea02df-009e-4322-9358-aafabf1e2fd9-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq\" (UID: \"69ea02df-009e-4322-9358-aafabf1e2fd9\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq" Apr 22 21:17:12.116261 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:12.116223 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9q2wp\" (UniqueName: \"kubernetes.io/projected/69ea02df-009e-4322-9358-aafabf1e2fd9-kube-api-access-9q2wp\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq\" (UID: \"69ea02df-009e-4322-9358-aafabf1e2fd9\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq" Apr 22 21:17:12.116439 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:12.116268 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69ea02df-009e-4322-9358-aafabf1e2fd9-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq\" (UID: \"69ea02df-009e-4322-9358-aafabf1e2fd9\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq" Apr 22 21:17:12.116439 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:12.116297 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69ea02df-009e-4322-9358-aafabf1e2fd9-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq\" (UID: \"69ea02df-009e-4322-9358-aafabf1e2fd9\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq" Apr 22 21:17:12.116657 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:12.116641 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69ea02df-009e-4322-9358-aafabf1e2fd9-bundle\") pod 
\"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq\" (UID: \"69ea02df-009e-4322-9358-aafabf1e2fd9\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq" Apr 22 21:17:12.116703 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:12.116677 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69ea02df-009e-4322-9358-aafabf1e2fd9-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq\" (UID: \"69ea02df-009e-4322-9358-aafabf1e2fd9\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq" Apr 22 21:17:12.128891 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:12.128861 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q2wp\" (UniqueName: \"kubernetes.io/projected/69ea02df-009e-4322-9358-aafabf1e2fd9-kube-api-access-9q2wp\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq\" (UID: \"69ea02df-009e-4322-9358-aafabf1e2fd9\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq" Apr 22 21:17:12.253212 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:12.253137 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq" Apr 22 21:17:12.379684 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:12.379653 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq"] Apr 22 21:17:12.382069 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:17:12.382041 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69ea02df_009e_4322_9358_aafabf1e2fd9.slice/crio-62905f7aed205d1771d9e6ac69d10abdfeb95f3510a359de1e5d37a14db7cb4d WatchSource:0}: Error finding container 62905f7aed205d1771d9e6ac69d10abdfeb95f3510a359de1e5d37a14db7cb4d: Status 404 returned error can't find the container with id 62905f7aed205d1771d9e6ac69d10abdfeb95f3510a359de1e5d37a14db7cb4d Apr 22 21:17:12.742191 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:12.742151 2566 generic.go:358] "Generic (PLEG): container finished" podID="69ea02df-009e-4322-9358-aafabf1e2fd9" containerID="19c102a81f12b8ee16e970a85f6834aea442b3fad46dc4024ad02ee0a90b466d" exitCode=0 Apr 22 21:17:12.742353 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:12.742234 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq" event={"ID":"69ea02df-009e-4322-9358-aafabf1e2fd9","Type":"ContainerDied","Data":"19c102a81f12b8ee16e970a85f6834aea442b3fad46dc4024ad02ee0a90b466d"} Apr 22 21:17:12.742353 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:12.742265 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq" event={"ID":"69ea02df-009e-4322-9358-aafabf1e2fd9","Type":"ContainerStarted","Data":"62905f7aed205d1771d9e6ac69d10abdfeb95f3510a359de1e5d37a14db7cb4d"} Apr 22 21:17:13.747117 ip-10-0-134-137 kubenswrapper[2566]: 
I0422 21:17:13.747081 2566 generic.go:358] "Generic (PLEG): container finished" podID="69ea02df-009e-4322-9358-aafabf1e2fd9" containerID="a7ab755699015703fc40bfdc54edebf02f53df9e5d9b248cc9e69f3220205193" exitCode=0 Apr 22 21:17:13.747548 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:13.747155 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq" event={"ID":"69ea02df-009e-4322-9358-aafabf1e2fd9","Type":"ContainerDied","Data":"a7ab755699015703fc40bfdc54edebf02f53df9e5d9b248cc9e69f3220205193"} Apr 22 21:17:13.920097 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:13.920012 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-l2kj7"] Apr 22 21:17:13.923512 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:13.923489 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-l2kj7" Apr 22 21:17:13.926539 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:13.926510 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swsmh\" (UniqueName: \"kubernetes.io/projected/a983a88a-0e55-416c-8001-a5a5d48336d6-kube-api-access-swsmh\") pod \"servicemesh-operator3-55f49c5f94-l2kj7\" (UID: \"a983a88a-0e55-416c-8001-a5a5d48336d6\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-l2kj7" Apr 22 21:17:13.926662 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:13.926557 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a983a88a-0e55-416c-8001-a5a5d48336d6-operator-config\") pod \"servicemesh-operator3-55f49c5f94-l2kj7\" (UID: \"a983a88a-0e55-416c-8001-a5a5d48336d6\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-l2kj7" Apr 22 21:17:13.926662 ip-10-0-134-137 
kubenswrapper[2566]: I0422 21:17:13.926572 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-8br22\"" Apr 22 21:17:13.926662 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:13.926572 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 22 21:17:13.930305 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:13.930286 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 22 21:17:13.947020 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:13.946994 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-l2kj7"] Apr 22 21:17:14.027141 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:14.027111 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swsmh\" (UniqueName: \"kubernetes.io/projected/a983a88a-0e55-416c-8001-a5a5d48336d6-kube-api-access-swsmh\") pod \"servicemesh-operator3-55f49c5f94-l2kj7\" (UID: \"a983a88a-0e55-416c-8001-a5a5d48336d6\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-l2kj7" Apr 22 21:17:14.027239 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:14.027159 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a983a88a-0e55-416c-8001-a5a5d48336d6-operator-config\") pod \"servicemesh-operator3-55f49c5f94-l2kj7\" (UID: \"a983a88a-0e55-416c-8001-a5a5d48336d6\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-l2kj7" Apr 22 21:17:14.029393 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:14.029375 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/a983a88a-0e55-416c-8001-a5a5d48336d6-operator-config\") pod 
\"servicemesh-operator3-55f49c5f94-l2kj7\" (UID: \"a983a88a-0e55-416c-8001-a5a5d48336d6\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-l2kj7" Apr 22 21:17:14.048159 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:14.048135 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swsmh\" (UniqueName: \"kubernetes.io/projected/a983a88a-0e55-416c-8001-a5a5d48336d6-kube-api-access-swsmh\") pod \"servicemesh-operator3-55f49c5f94-l2kj7\" (UID: \"a983a88a-0e55-416c-8001-a5a5d48336d6\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-l2kj7" Apr 22 21:17:14.233148 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:14.233071 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-l2kj7" Apr 22 21:17:14.354084 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:14.354061 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-l2kj7"] Apr 22 21:17:14.356910 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:17:14.356882 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda983a88a_0e55_416c_8001_a5a5d48336d6.slice/crio-ad440d175e32b2021dcec7de1566db06243f142a2c0a9e3c026c1092c7bfe125 WatchSource:0}: Error finding container ad440d175e32b2021dcec7de1566db06243f142a2c0a9e3c026c1092c7bfe125: Status 404 returned error can't find the container with id ad440d175e32b2021dcec7de1566db06243f142a2c0a9e3c026c1092c7bfe125 Apr 22 21:17:14.752714 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:14.752678 2566 generic.go:358] "Generic (PLEG): container finished" podID="69ea02df-009e-4322-9358-aafabf1e2fd9" containerID="98996d583f5e1ef7b3101a96f94ef3a19bbda7c0464b53770097c9dc5365f955" exitCode=0 Apr 22 21:17:14.753144 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:14.752761 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq" event={"ID":"69ea02df-009e-4322-9358-aafabf1e2fd9","Type":"ContainerDied","Data":"98996d583f5e1ef7b3101a96f94ef3a19bbda7c0464b53770097c9dc5365f955"} Apr 22 21:17:14.753922 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:14.753901 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-l2kj7" event={"ID":"a983a88a-0e55-416c-8001-a5a5d48336d6","Type":"ContainerStarted","Data":"ad440d175e32b2021dcec7de1566db06243f142a2c0a9e3c026c1092c7bfe125"} Apr 22 21:17:16.600505 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:16.600483 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq" Apr 22 21:17:16.645571 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:16.645548 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69ea02df-009e-4322-9358-aafabf1e2fd9-util\") pod \"69ea02df-009e-4322-9358-aafabf1e2fd9\" (UID: \"69ea02df-009e-4322-9358-aafabf1e2fd9\") " Apr 22 21:17:16.645652 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:16.645599 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q2wp\" (UniqueName: \"kubernetes.io/projected/69ea02df-009e-4322-9358-aafabf1e2fd9-kube-api-access-9q2wp\") pod \"69ea02df-009e-4322-9358-aafabf1e2fd9\" (UID: \"69ea02df-009e-4322-9358-aafabf1e2fd9\") " Apr 22 21:17:16.645746 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:16.645665 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69ea02df-009e-4322-9358-aafabf1e2fd9-bundle\") pod \"69ea02df-009e-4322-9358-aafabf1e2fd9\" (UID: \"69ea02df-009e-4322-9358-aafabf1e2fd9\") " Apr 22 21:17:16.646500 ip-10-0-134-137 
kubenswrapper[2566]: I0422 21:17:16.646472 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69ea02df-009e-4322-9358-aafabf1e2fd9-bundle" (OuterVolumeSpecName: "bundle") pod "69ea02df-009e-4322-9358-aafabf1e2fd9" (UID: "69ea02df-009e-4322-9358-aafabf1e2fd9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:17:16.647835 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:16.647808 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69ea02df-009e-4322-9358-aafabf1e2fd9-kube-api-access-9q2wp" (OuterVolumeSpecName: "kube-api-access-9q2wp") pod "69ea02df-009e-4322-9358-aafabf1e2fd9" (UID: "69ea02df-009e-4322-9358-aafabf1e2fd9"). InnerVolumeSpecName "kube-api-access-9q2wp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 21:17:16.651108 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:16.651082 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69ea02df-009e-4322-9358-aafabf1e2fd9-util" (OuterVolumeSpecName: "util") pod "69ea02df-009e-4322-9358-aafabf1e2fd9" (UID: "69ea02df-009e-4322-9358-aafabf1e2fd9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:17:16.746857 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:16.746811 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69ea02df-009e-4322-9358-aafabf1e2fd9-bundle\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\""
Apr 22 21:17:16.746857 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:16.746847 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69ea02df-009e-4322-9358-aafabf1e2fd9-util\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\""
Apr 22 21:17:16.746857 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:16.746857 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9q2wp\" (UniqueName: \"kubernetes.io/projected/69ea02df-009e-4322-9358-aafabf1e2fd9-kube-api-access-9q2wp\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\""
Apr 22 21:17:16.762727 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:16.762698 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq"
Apr 22 21:17:16.762898 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:16.762698 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebddjcq" event={"ID":"69ea02df-009e-4322-9358-aafabf1e2fd9","Type":"ContainerDied","Data":"62905f7aed205d1771d9e6ac69d10abdfeb95f3510a359de1e5d37a14db7cb4d"}
Apr 22 21:17:16.762898 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:16.762832 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62905f7aed205d1771d9e6ac69d10abdfeb95f3510a359de1e5d37a14db7cb4d"
Apr 22 21:17:16.764195 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:16.764175 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-l2kj7" event={"ID":"a983a88a-0e55-416c-8001-a5a5d48336d6","Type":"ContainerStarted","Data":"67cea5c4e025ac61ed70acd567c312685a710d2250f2d7e9fdef733783513edc"}
Apr 22 21:17:16.764368 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:16.764355 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-l2kj7"
Apr 22 21:17:16.782634 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:16.782591 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-l2kj7" podStartSLOduration=1.495563402 podStartE2EDuration="3.782578265s" podCreationTimestamp="2026-04-22 21:17:13 +0000 UTC" firstStartedPulling="2026-04-22 21:17:14.359963517 +0000 UTC m=+491.716778250" lastFinishedPulling="2026-04-22 21:17:16.646978376 +0000 UTC m=+494.003793113" observedRunningTime="2026-04-22 21:17:16.78033039 +0000 UTC m=+494.137145141" watchObservedRunningTime="2026-04-22 21:17:16.782578265 +0000 UTC m=+494.139393016"
Apr 22 21:17:24.770023 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.769984 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"]
Apr 22 21:17:24.770384 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.770334 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69ea02df-009e-4322-9358-aafabf1e2fd9" containerName="extract"
Apr 22 21:17:24.770384 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.770347 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ea02df-009e-4322-9358-aafabf1e2fd9" containerName="extract"
Apr 22 21:17:24.770384 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.770374 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69ea02df-009e-4322-9358-aafabf1e2fd9" containerName="pull"
Apr 22 21:17:24.770384 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.770382 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ea02df-009e-4322-9358-aafabf1e2fd9" containerName="pull"
Apr 22 21:17:24.770549 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.770456 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69ea02df-009e-4322-9358-aafabf1e2fd9" containerName="util"
Apr 22 21:17:24.770549 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.770465 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ea02df-009e-4322-9358-aafabf1e2fd9" containerName="util"
Apr 22 21:17:24.770549 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.770537 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="69ea02df-009e-4322-9358-aafabf1e2fd9" containerName="extract"
Apr 22 21:17:24.773635 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.773615 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:24.775464 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.775437 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 22 21:17:24.775668 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.775654 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 22 21:17:24.775753 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.775732 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-dkk9n\""
Apr 22 21:17:24.775801 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.775775 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 22 21:17:24.775839 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.775817 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 22 21:17:24.781319 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.781294 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"]
Apr 22 21:17:24.805384 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.805355 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/16586da0-c3ba-4397-8b50-5817ee975d70-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-rldff\" (UID: \"16586da0-c3ba-4397-8b50-5817ee975d70\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:24.805516 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.805387 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/16586da0-c3ba-4397-8b50-5817ee975d70-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-rldff\" (UID: \"16586da0-c3ba-4397-8b50-5817ee975d70\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:24.805516 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.805430 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/16586da0-c3ba-4397-8b50-5817ee975d70-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-rldff\" (UID: \"16586da0-c3ba-4397-8b50-5817ee975d70\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:24.805516 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.805487 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/16586da0-c3ba-4397-8b50-5817ee975d70-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-rldff\" (UID: \"16586da0-c3ba-4397-8b50-5817ee975d70\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:24.805637 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.805528 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsl2x\" (UniqueName: \"kubernetes.io/projected/16586da0-c3ba-4397-8b50-5817ee975d70-kube-api-access-gsl2x\") pod \"istiod-openshift-gateway-55ff986f96-rldff\" (UID: \"16586da0-c3ba-4397-8b50-5817ee975d70\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:24.805637 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.805572 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/16586da0-c3ba-4397-8b50-5817ee975d70-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-rldff\" (UID: \"16586da0-c3ba-4397-8b50-5817ee975d70\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:24.805637 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.805596 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/16586da0-c3ba-4397-8b50-5817ee975d70-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-rldff\" (UID: \"16586da0-c3ba-4397-8b50-5817ee975d70\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:24.906389 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.906356 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/16586da0-c3ba-4397-8b50-5817ee975d70-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-rldff\" (UID: \"16586da0-c3ba-4397-8b50-5817ee975d70\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:24.906389 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.906393 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/16586da0-c3ba-4397-8b50-5817ee975d70-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-rldff\" (UID: \"16586da0-c3ba-4397-8b50-5817ee975d70\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:24.906572 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.906450 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/16586da0-c3ba-4397-8b50-5817ee975d70-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-rldff\" (UID: \"16586da0-c3ba-4397-8b50-5817ee975d70\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:24.906572 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.906501 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gsl2x\" (UniqueName: \"kubernetes.io/projected/16586da0-c3ba-4397-8b50-5817ee975d70-kube-api-access-gsl2x\") pod \"istiod-openshift-gateway-55ff986f96-rldff\" (UID: \"16586da0-c3ba-4397-8b50-5817ee975d70\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:24.906572 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.906547 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/16586da0-c3ba-4397-8b50-5817ee975d70-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-rldff\" (UID: \"16586da0-c3ba-4397-8b50-5817ee975d70\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:24.906749 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.906571 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/16586da0-c3ba-4397-8b50-5817ee975d70-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-rldff\" (UID: \"16586da0-c3ba-4397-8b50-5817ee975d70\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:24.906749 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.906619 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/16586da0-c3ba-4397-8b50-5817ee975d70-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-rldff\" (UID: \"16586da0-c3ba-4397-8b50-5817ee975d70\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:24.907207 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.907175 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/16586da0-c3ba-4397-8b50-5817ee975d70-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-rldff\" (UID: \"16586da0-c3ba-4397-8b50-5817ee975d70\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:24.908873 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.908844 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/16586da0-c3ba-4397-8b50-5817ee975d70-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-rldff\" (UID: \"16586da0-c3ba-4397-8b50-5817ee975d70\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:24.909001 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.908856 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/16586da0-c3ba-4397-8b50-5817ee975d70-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-rldff\" (UID: \"16586da0-c3ba-4397-8b50-5817ee975d70\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:24.909001 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.908929 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/16586da0-c3ba-4397-8b50-5817ee975d70-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-rldff\" (UID: \"16586da0-c3ba-4397-8b50-5817ee975d70\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:24.909001 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.908982 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/16586da0-c3ba-4397-8b50-5817ee975d70-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-rldff\" (UID: \"16586da0-c3ba-4397-8b50-5817ee975d70\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:24.913514 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.913490 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsl2x\" (UniqueName: \"kubernetes.io/projected/16586da0-c3ba-4397-8b50-5817ee975d70-kube-api-access-gsl2x\") pod \"istiod-openshift-gateway-55ff986f96-rldff\" (UID: \"16586da0-c3ba-4397-8b50-5817ee975d70\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:24.913611 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:24.913554 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/16586da0-c3ba-4397-8b50-5817ee975d70-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-rldff\" (UID: \"16586da0-c3ba-4397-8b50-5817ee975d70\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:25.083151 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:25.083108 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:25.209868 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:25.209841 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"]
Apr 22 21:17:25.212178 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:17:25.212150 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16586da0_c3ba_4397_8b50_5817ee975d70.slice/crio-262299cd2fb11153f9ee664fee0b8a192c54ce96b4b05934b197f1f3f4b31c6d WatchSource:0}: Error finding container 262299cd2fb11153f9ee664fee0b8a192c54ce96b4b05934b197f1f3f4b31c6d: Status 404 returned error can't find the container with id 262299cd2fb11153f9ee664fee0b8a192c54ce96b4b05934b197f1f3f4b31c6d
Apr 22 21:17:25.795341 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:25.795307 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff" event={"ID":"16586da0-c3ba-4397-8b50-5817ee975d70","Type":"ContainerStarted","Data":"262299cd2fb11153f9ee664fee0b8a192c54ce96b4b05934b197f1f3f4b31c6d"}
Apr 22 21:17:27.693259 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:27.693223 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-stmtk"
Apr 22 21:17:27.770253 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:27.770221 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-l2kj7"
Apr 22 21:17:27.846708 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:27.846654 2566 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 22 21:17:27.846823 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:27.846738 2566 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 22 21:17:28.808091 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:28.808058 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff" event={"ID":"16586da0-c3ba-4397-8b50-5817ee975d70","Type":"ContainerStarted","Data":"a00571346d61ea6e3edd8de709c041c63884710287baf430d661fea406ca6678"}
Apr 22 21:17:28.808584 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:28.808275 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:17:28.809810 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:28.809784 2566 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-rldff container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 22 21:17:28.809925 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:28.809832 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff" podUID="16586da0-c3ba-4397-8b50-5817ee975d70" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 21:17:28.857057 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:28.857000 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff" podStartSLOduration=2.22495376 podStartE2EDuration="4.856985119s" podCreationTimestamp="2026-04-22 21:17:24 +0000 UTC" firstStartedPulling="2026-04-22 21:17:25.214368773 +0000 UTC m=+502.571183506" lastFinishedPulling="2026-04-22 21:17:27.846400119 +0000 UTC m=+505.203214865" observedRunningTime="2026-04-22 21:17:28.855574766 +0000 UTC m=+506.212389544" watchObservedRunningTime="2026-04-22 21:17:28.856985119 +0000 UTC m=+506.213799871"
Apr 22 21:17:29.811651 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:17:29.811623 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-rldff"
Apr 22 21:18:00.503242 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:00.503152 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d"]
Apr 22 21:18:00.508819 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:00.508799 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d"
Apr 22 21:18:00.511262 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:00.511236 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 22 21:18:00.511532 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:00.511232 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-5sldk\""
Apr 22 21:18:00.511532 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:00.511484 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 22 21:18:00.513084 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:00.513060 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d"]
Apr 22 21:18:00.606946 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:00.606909 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f85a7385-6408-446c-9911-6446129baf21-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d\" (UID: \"f85a7385-6408-446c-9911-6446129baf21\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d"
Apr 22 21:18:00.607133 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:00.606967 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stmhx\" (UniqueName: \"kubernetes.io/projected/f85a7385-6408-446c-9911-6446129baf21-kube-api-access-stmhx\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d\" (UID: \"f85a7385-6408-446c-9911-6446129baf21\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d"
Apr 22 21:18:00.607133 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:00.607059 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f85a7385-6408-446c-9911-6446129baf21-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d\" (UID: \"f85a7385-6408-446c-9911-6446129baf21\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d"
Apr 22 21:18:00.708278 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:00.708238 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stmhx\" (UniqueName: \"kubernetes.io/projected/f85a7385-6408-446c-9911-6446129baf21-kube-api-access-stmhx\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d\" (UID: \"f85a7385-6408-446c-9911-6446129baf21\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d"
Apr 22 21:18:00.708483 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:00.708284 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f85a7385-6408-446c-9911-6446129baf21-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d\" (UID: \"f85a7385-6408-446c-9911-6446129baf21\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d"
Apr 22 21:18:00.708483 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:00.708343 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f85a7385-6408-446c-9911-6446129baf21-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d\" (UID: \"f85a7385-6408-446c-9911-6446129baf21\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d"
Apr 22 21:18:00.708728 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:00.708709 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f85a7385-6408-446c-9911-6446129baf21-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d\" (UID: \"f85a7385-6408-446c-9911-6446129baf21\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d"
Apr 22 21:18:00.708766 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:00.708729 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f85a7385-6408-446c-9911-6446129baf21-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d\" (UID: \"f85a7385-6408-446c-9911-6446129baf21\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d"
Apr 22 21:18:00.716040 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:00.716014 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stmhx\" (UniqueName: \"kubernetes.io/projected/f85a7385-6408-446c-9911-6446129baf21-kube-api-access-stmhx\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d\" (UID: \"f85a7385-6408-446c-9911-6446129baf21\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d"
Apr 22 21:18:00.818890 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:00.818854 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d"
Apr 22 21:18:00.942140 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:00.942107 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d"]
Apr 22 21:18:00.945298 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:18:00.945270 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85a7385_6408_446c_9911_6446129baf21.slice/crio-bc0c9702041d94d7a342d67d9d939c9a76d98075c2264106d307849b93d6c34d WatchSource:0}: Error finding container bc0c9702041d94d7a342d67d9d939c9a76d98075c2264106d307849b93d6c34d: Status 404 returned error can't find the container with id bc0c9702041d94d7a342d67d9d939c9a76d98075c2264106d307849b93d6c34d
Apr 22 21:18:01.101691 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.101608 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5"]
Apr 22 21:18:01.104762 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.104746 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5"
Apr 22 21:18:01.110397 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.110369 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5"]
Apr 22 21:18:01.211615 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.211569 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2w72\" (UniqueName: \"kubernetes.io/projected/bc060510-4b85-48e8-8bb0-4a4d470d8cc2-kube-api-access-w2w72\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5\" (UID: \"bc060510-4b85-48e8-8bb0-4a4d470d8cc2\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5"
Apr 22 21:18:01.211798 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.211674 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc060510-4b85-48e8-8bb0-4a4d470d8cc2-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5\" (UID: \"bc060510-4b85-48e8-8bb0-4a4d470d8cc2\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5"
Apr 22 21:18:01.211798 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.211710 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc060510-4b85-48e8-8bb0-4a4d470d8cc2-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5\" (UID: \"bc060510-4b85-48e8-8bb0-4a4d470d8cc2\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5"
Apr 22 21:18:01.313058 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.313008 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2w72\" (UniqueName: \"kubernetes.io/projected/bc060510-4b85-48e8-8bb0-4a4d470d8cc2-kube-api-access-w2w72\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5\" (UID: \"bc060510-4b85-48e8-8bb0-4a4d470d8cc2\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5"
Apr 22 21:18:01.313236 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.313076 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc060510-4b85-48e8-8bb0-4a4d470d8cc2-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5\" (UID: \"bc060510-4b85-48e8-8bb0-4a4d470d8cc2\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5"
Apr 22 21:18:01.313236 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.313107 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc060510-4b85-48e8-8bb0-4a4d470d8cc2-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5\" (UID: \"bc060510-4b85-48e8-8bb0-4a4d470d8cc2\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5"
Apr 22 21:18:01.313563 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.313543 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc060510-4b85-48e8-8bb0-4a4d470d8cc2-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5\" (UID: \"bc060510-4b85-48e8-8bb0-4a4d470d8cc2\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5"
Apr 22 21:18:01.313563 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.313556 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc060510-4b85-48e8-8bb0-4a4d470d8cc2-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5\" (UID: \"bc060510-4b85-48e8-8bb0-4a4d470d8cc2\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5"
Apr 22 21:18:01.320668 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.320644 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2w72\" (UniqueName: \"kubernetes.io/projected/bc060510-4b85-48e8-8bb0-4a4d470d8cc2-kube-api-access-w2w72\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5\" (UID: \"bc060510-4b85-48e8-8bb0-4a4d470d8cc2\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5"
Apr 22 21:18:01.415723 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.415632 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5"
Apr 22 21:18:01.564075 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.563996 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5"]
Apr 22 21:18:01.698029 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.697948 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v"]
Apr 22 21:18:01.701270 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.701251 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v"
Apr 22 21:18:01.708542 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.708515 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v"]
Apr 22 21:18:01.825511 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.825464 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bab93fad-8ef3-4d6c-baf4-8a9a2431581f-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v\" (UID: \"bab93fad-8ef3-4d6c-baf4-8a9a2431581f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v"
Apr 22 21:18:01.825698 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.825548 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bab93fad-8ef3-4d6c-baf4-8a9a2431581f-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v\" (UID: \"bab93fad-8ef3-4d6c-baf4-8a9a2431581f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v"
Apr 22 21:18:01.825698 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.825599 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69gqf\" (UniqueName: \"kubernetes.io/projected/bab93fad-8ef3-4d6c-baf4-8a9a2431581f-kube-api-access-69gqf\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v\" (UID: \"bab93fad-8ef3-4d6c-baf4-8a9a2431581f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v"
Apr 22 21:18:01.917316 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.917282 2566 generic.go:358] "Generic (PLEG): container finished" podID="f85a7385-6408-446c-9911-6446129baf21" containerID="a125aecbc458554538749022c6d679ff14a2c03366dff93f640da4877959041a" exitCode=0
Apr 22 21:18:01.917550 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.917369 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d" event={"ID":"f85a7385-6408-446c-9911-6446129baf21","Type":"ContainerDied","Data":"a125aecbc458554538749022c6d679ff14a2c03366dff93f640da4877959041a"}
Apr 22 21:18:01.917550 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.917428 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d" event={"ID":"f85a7385-6408-446c-9911-6446129baf21","Type":"ContainerStarted","Data":"bc0c9702041d94d7a342d67d9d939c9a76d98075c2264106d307849b93d6c34d"}
Apr 22 21:18:01.918962 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.918941 2566 generic.go:358] "Generic (PLEG): container finished" podID="bc060510-4b85-48e8-8bb0-4a4d470d8cc2" containerID="be8ed3679035b0a1e57d3e72985359a9a779160a5c3e3c49bdbd169dde1504a4" exitCode=0
Apr 22 21:18:01.919058 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.919028 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5" event={"ID":"bc060510-4b85-48e8-8bb0-4a4d470d8cc2","Type":"ContainerDied","Data":"be8ed3679035b0a1e57d3e72985359a9a779160a5c3e3c49bdbd169dde1504a4"}
Apr 22 21:18:01.919111 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.919066 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5" event={"ID":"bc060510-4b85-48e8-8bb0-4a4d470d8cc2","Type":"ContainerStarted","Data":"e5e8ec6cb1abc9b1ed423d3e34f4dc5be5378fa6db88d5ed7af4cba90c654f55"}
Apr 22 21:18:01.926948 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.926916 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bab93fad-8ef3-4d6c-baf4-8a9a2431581f-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v\" (UID: \"bab93fad-8ef3-4d6c-baf4-8a9a2431581f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v"
Apr 22 21:18:01.927070 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.926970 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69gqf\" (UniqueName: \"kubernetes.io/projected/bab93fad-8ef3-4d6c-baf4-8a9a2431581f-kube-api-access-69gqf\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v\" (UID: \"bab93fad-8ef3-4d6c-baf4-8a9a2431581f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v"
Apr 22 21:18:01.927070 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.926993 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bab93fad-8ef3-4d6c-baf4-8a9a2431581f-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v\" (UID: \"bab93fad-8ef3-4d6c-baf4-8a9a2431581f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v"
Apr 22 21:18:01.927343 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.927320 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bab93fad-8ef3-4d6c-baf4-8a9a2431581f-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v\" (UID: \"bab93fad-8ef3-4d6c-baf4-8a9a2431581f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v"
Apr 22 21:18:01.927343 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.927336 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName:
\"kubernetes.io/empty-dir/bab93fad-8ef3-4d6c-baf4-8a9a2431581f-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v\" (UID: \"bab93fad-8ef3-4d6c-baf4-8a9a2431581f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v" Apr 22 21:18:01.935613 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:01.935588 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69gqf\" (UniqueName: \"kubernetes.io/projected/bab93fad-8ef3-4d6c-baf4-8a9a2431581f-kube-api-access-69gqf\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v\" (UID: \"bab93fad-8ef3-4d6c-baf4-8a9a2431581f\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v" Apr 22 21:18:02.012508 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.012423 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v" Apr 22 21:18:02.104050 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.104012 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw"] Apr 22 21:18:02.109058 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.109028 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw" Apr 22 21:18:02.112786 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.112756 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw"] Apr 22 21:18:02.138315 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.138289 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v"] Apr 22 21:18:02.140851 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:18:02.140818 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbab93fad_8ef3_4d6c_baf4_8a9a2431581f.slice/crio-d1e989e9d3b51e8674448abeca63603a8e2e0e4c9cdc01c56b9db64ea8ddff58 WatchSource:0}: Error finding container d1e989e9d3b51e8674448abeca63603a8e2e0e4c9cdc01c56b9db64ea8ddff58: Status 404 returned error can't find the container with id d1e989e9d3b51e8674448abeca63603a8e2e0e4c9cdc01c56b9db64ea8ddff58 Apr 22 21:18:02.229257 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.229212 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsz72\" (UniqueName: \"kubernetes.io/projected/221ac19a-80b1-431f-9b74-31be7bb3ab98-kube-api-access-fsz72\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw\" (UID: \"221ac19a-80b1-431f-9b74-31be7bb3ab98\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw" Apr 22 21:18:02.229463 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.229276 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/221ac19a-80b1-431f-9b74-31be7bb3ab98-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw\" (UID: 
\"221ac19a-80b1-431f-9b74-31be7bb3ab98\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw" Apr 22 21:18:02.229463 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.229324 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/221ac19a-80b1-431f-9b74-31be7bb3ab98-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw\" (UID: \"221ac19a-80b1-431f-9b74-31be7bb3ab98\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw" Apr 22 21:18:02.329977 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.329934 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsz72\" (UniqueName: \"kubernetes.io/projected/221ac19a-80b1-431f-9b74-31be7bb3ab98-kube-api-access-fsz72\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw\" (UID: \"221ac19a-80b1-431f-9b74-31be7bb3ab98\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw" Apr 22 21:18:02.330187 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.329991 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/221ac19a-80b1-431f-9b74-31be7bb3ab98-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw\" (UID: \"221ac19a-80b1-431f-9b74-31be7bb3ab98\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw" Apr 22 21:18:02.330187 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.330015 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/221ac19a-80b1-431f-9b74-31be7bb3ab98-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw\" (UID: \"221ac19a-80b1-431f-9b74-31be7bb3ab98\") " 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw" Apr 22 21:18:02.330504 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.330480 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/221ac19a-80b1-431f-9b74-31be7bb3ab98-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw\" (UID: \"221ac19a-80b1-431f-9b74-31be7bb3ab98\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw" Apr 22 21:18:02.330556 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.330501 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/221ac19a-80b1-431f-9b74-31be7bb3ab98-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw\" (UID: \"221ac19a-80b1-431f-9b74-31be7bb3ab98\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw" Apr 22 21:18:02.337352 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.337322 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsz72\" (UniqueName: \"kubernetes.io/projected/221ac19a-80b1-431f-9b74-31be7bb3ab98-kube-api-access-fsz72\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw\" (UID: \"221ac19a-80b1-431f-9b74-31be7bb3ab98\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw" Apr 22 21:18:02.421349 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.421313 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw" Apr 22 21:18:02.550182 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.550137 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw"] Apr 22 21:18:02.553227 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:18:02.553192 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod221ac19a_80b1_431f_9b74_31be7bb3ab98.slice/crio-589697a32bc7a12314159f38cd96f921ac3d1f5ebbaec13c4ce593ce71a779d9 WatchSource:0}: Error finding container 589697a32bc7a12314159f38cd96f921ac3d1f5ebbaec13c4ce593ce71a779d9: Status 404 returned error can't find the container with id 589697a32bc7a12314159f38cd96f921ac3d1f5ebbaec13c4ce593ce71a779d9 Apr 22 21:18:02.925109 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.925069 2566 generic.go:358] "Generic (PLEG): container finished" podID="bc060510-4b85-48e8-8bb0-4a4d470d8cc2" containerID="48adbd14c3cfc0f8523b5b464e98215089afda5cd8aa15c2b55ada5ba88b7c81" exitCode=0 Apr 22 21:18:02.925580 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.925163 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5" event={"ID":"bc060510-4b85-48e8-8bb0-4a4d470d8cc2","Type":"ContainerDied","Data":"48adbd14c3cfc0f8523b5b464e98215089afda5cd8aa15c2b55ada5ba88b7c81"} Apr 22 21:18:02.926635 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.926601 2566 generic.go:358] "Generic (PLEG): container finished" podID="221ac19a-80b1-431f-9b74-31be7bb3ab98" containerID="774000c66d0971dab28c2cb125dbde5ec11d7e9d60acd53fb589874bd2568d60" exitCode=0 Apr 22 21:18:02.926748 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.926680 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw" event={"ID":"221ac19a-80b1-431f-9b74-31be7bb3ab98","Type":"ContainerDied","Data":"774000c66d0971dab28c2cb125dbde5ec11d7e9d60acd53fb589874bd2568d60"} Apr 22 21:18:02.926748 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.926717 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw" event={"ID":"221ac19a-80b1-431f-9b74-31be7bb3ab98","Type":"ContainerStarted","Data":"589697a32bc7a12314159f38cd96f921ac3d1f5ebbaec13c4ce593ce71a779d9"} Apr 22 21:18:02.928236 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.928215 2566 generic.go:358] "Generic (PLEG): container finished" podID="bab93fad-8ef3-4d6c-baf4-8a9a2431581f" containerID="372f394af8525cc5bd93e4b2c7f7b5c1daeed28e207d0c8d58e0afb0cb4295ca" exitCode=0 Apr 22 21:18:02.928349 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.928313 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v" event={"ID":"bab93fad-8ef3-4d6c-baf4-8a9a2431581f","Type":"ContainerDied","Data":"372f394af8525cc5bd93e4b2c7f7b5c1daeed28e207d0c8d58e0afb0cb4295ca"} Apr 22 21:18:02.928349 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.928344 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v" event={"ID":"bab93fad-8ef3-4d6c-baf4-8a9a2431581f","Type":"ContainerStarted","Data":"d1e989e9d3b51e8674448abeca63603a8e2e0e4c9cdc01c56b9db64ea8ddff58"} Apr 22 21:18:02.930066 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.930044 2566 generic.go:358] "Generic (PLEG): container finished" podID="f85a7385-6408-446c-9911-6446129baf21" containerID="861b235ff4b793d5b55b50840c9399c76adb17f286cd5e0a50948bd0bc5e0c6a" exitCode=0 Apr 22 21:18:02.930160 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:02.930125 2566 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d" event={"ID":"f85a7385-6408-446c-9911-6446129baf21","Type":"ContainerDied","Data":"861b235ff4b793d5b55b50840c9399c76adb17f286cd5e0a50948bd0bc5e0c6a"} Apr 22 21:18:03.935025 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:03.934948 2566 generic.go:358] "Generic (PLEG): container finished" podID="bab93fad-8ef3-4d6c-baf4-8a9a2431581f" containerID="77dc2de67cbf08915aa85c1b538e7d5226f2b534c3528b629a64b5fceda2eb20" exitCode=0 Apr 22 21:18:03.935430 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:03.935037 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v" event={"ID":"bab93fad-8ef3-4d6c-baf4-8a9a2431581f","Type":"ContainerDied","Data":"77dc2de67cbf08915aa85c1b538e7d5226f2b534c3528b629a64b5fceda2eb20"} Apr 22 21:18:03.936957 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:03.936937 2566 generic.go:358] "Generic (PLEG): container finished" podID="f85a7385-6408-446c-9911-6446129baf21" containerID="038d16822e5997782b9356cce3ee28f46d5984852c535b1d5fb035f3b70d72ac" exitCode=0 Apr 22 21:18:03.937035 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:03.937017 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d" event={"ID":"f85a7385-6408-446c-9911-6446129baf21","Type":"ContainerDied","Data":"038d16822e5997782b9356cce3ee28f46d5984852c535b1d5fb035f3b70d72ac"} Apr 22 21:18:03.938938 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:03.938920 2566 generic.go:358] "Generic (PLEG): container finished" podID="bc060510-4b85-48e8-8bb0-4a4d470d8cc2" containerID="3d74d83855e22453543e37605647d47407e58581b147e2e1fd57930b3d9311e5" exitCode=0 Apr 22 21:18:03.939024 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:03.938996 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5" event={"ID":"bc060510-4b85-48e8-8bb0-4a4d470d8cc2","Type":"ContainerDied","Data":"3d74d83855e22453543e37605647d47407e58581b147e2e1fd57930b3d9311e5"} Apr 22 21:18:03.940448 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:03.940431 2566 generic.go:358] "Generic (PLEG): container finished" podID="221ac19a-80b1-431f-9b74-31be7bb3ab98" containerID="16ecd15ebb7dd01a7d5eb4e71182f32d653de61d6d282c145fe2c3b53322ca91" exitCode=0 Apr 22 21:18:03.940510 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:03.940458 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw" event={"ID":"221ac19a-80b1-431f-9b74-31be7bb3ab98","Type":"ContainerDied","Data":"16ecd15ebb7dd01a7d5eb4e71182f32d653de61d6d282c145fe2c3b53322ca91"} Apr 22 21:18:04.945726 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:04.945692 2566 generic.go:358] "Generic (PLEG): container finished" podID="221ac19a-80b1-431f-9b74-31be7bb3ab98" containerID="29e7100dce4fe1cc4b1bf7e2300d643d446296f5eec00caa20b3f3b1981a938f" exitCode=0 Apr 22 21:18:04.946123 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:04.945780 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw" event={"ID":"221ac19a-80b1-431f-9b74-31be7bb3ab98","Type":"ContainerDied","Data":"29e7100dce4fe1cc4b1bf7e2300d643d446296f5eec00caa20b3f3b1981a938f"} Apr 22 21:18:04.947624 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:04.947598 2566 generic.go:358] "Generic (PLEG): container finished" podID="bab93fad-8ef3-4d6c-baf4-8a9a2431581f" containerID="90f2eea281f03d514a6c5200e31a175a1d2c32aae3bedd2750749deeacc7abda" exitCode=0 Apr 22 21:18:04.947715 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:04.947664 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v" event={"ID":"bab93fad-8ef3-4d6c-baf4-8a9a2431581f","Type":"ContainerDied","Data":"90f2eea281f03d514a6c5200e31a175a1d2c32aae3bedd2750749deeacc7abda"} Apr 22 21:18:05.079112 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.079083 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d" Apr 22 21:18:05.111495 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.111471 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5" Apr 22 21:18:05.258344 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.258254 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stmhx\" (UniqueName: \"kubernetes.io/projected/f85a7385-6408-446c-9911-6446129baf21-kube-api-access-stmhx\") pod \"f85a7385-6408-446c-9911-6446129baf21\" (UID: \"f85a7385-6408-446c-9911-6446129baf21\") " Apr 22 21:18:05.258344 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.258328 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f85a7385-6408-446c-9911-6446129baf21-bundle\") pod \"f85a7385-6408-446c-9911-6446129baf21\" (UID: \"f85a7385-6408-446c-9911-6446129baf21\") " Apr 22 21:18:05.258570 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.258353 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc060510-4b85-48e8-8bb0-4a4d470d8cc2-util\") pod \"bc060510-4b85-48e8-8bb0-4a4d470d8cc2\" (UID: \"bc060510-4b85-48e8-8bb0-4a4d470d8cc2\") " Apr 22 21:18:05.258570 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.258474 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc060510-4b85-48e8-8bb0-4a4d470d8cc2-bundle\") pod \"bc060510-4b85-48e8-8bb0-4a4d470d8cc2\" (UID: \"bc060510-4b85-48e8-8bb0-4a4d470d8cc2\") " Apr 22 21:18:05.258570 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.258548 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f85a7385-6408-446c-9911-6446129baf21-util\") pod \"f85a7385-6408-446c-9911-6446129baf21\" (UID: \"f85a7385-6408-446c-9911-6446129baf21\") " Apr 22 21:18:05.258720 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.258572 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2w72\" (UniqueName: \"kubernetes.io/projected/bc060510-4b85-48e8-8bb0-4a4d470d8cc2-kube-api-access-w2w72\") pod \"bc060510-4b85-48e8-8bb0-4a4d470d8cc2\" (UID: \"bc060510-4b85-48e8-8bb0-4a4d470d8cc2\") " Apr 22 21:18:05.259441 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.259007 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc060510-4b85-48e8-8bb0-4a4d470d8cc2-bundle" (OuterVolumeSpecName: "bundle") pod "bc060510-4b85-48e8-8bb0-4a4d470d8cc2" (UID: "bc060510-4b85-48e8-8bb0-4a4d470d8cc2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:18:05.259441 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.259059 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f85a7385-6408-446c-9911-6446129baf21-bundle" (OuterVolumeSpecName: "bundle") pod "f85a7385-6408-446c-9911-6446129baf21" (UID: "f85a7385-6408-446c-9911-6446129baf21"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:18:05.260874 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.260842 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f85a7385-6408-446c-9911-6446129baf21-kube-api-access-stmhx" (OuterVolumeSpecName: "kube-api-access-stmhx") pod "f85a7385-6408-446c-9911-6446129baf21" (UID: "f85a7385-6408-446c-9911-6446129baf21"). InnerVolumeSpecName "kube-api-access-stmhx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:18:05.261033 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.261010 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc060510-4b85-48e8-8bb0-4a4d470d8cc2-kube-api-access-w2w72" (OuterVolumeSpecName: "kube-api-access-w2w72") pod "bc060510-4b85-48e8-8bb0-4a4d470d8cc2" (UID: "bc060510-4b85-48e8-8bb0-4a4d470d8cc2"). InnerVolumeSpecName "kube-api-access-w2w72". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:18:05.263401 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.263363 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f85a7385-6408-446c-9911-6446129baf21-util" (OuterVolumeSpecName: "util") pod "f85a7385-6408-446c-9911-6446129baf21" (UID: "f85a7385-6408-446c-9911-6446129baf21"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:18:05.264004 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.263984 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc060510-4b85-48e8-8bb0-4a4d470d8cc2-util" (OuterVolumeSpecName: "util") pod "bc060510-4b85-48e8-8bb0-4a4d470d8cc2" (UID: "bc060510-4b85-48e8-8bb0-4a4d470d8cc2"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 21:18:05.360210 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.360176 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f85a7385-6408-446c-9911-6446129baf21-util\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:18:05.360210 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.360207 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w2w72\" (UniqueName: \"kubernetes.io/projected/bc060510-4b85-48e8-8bb0-4a4d470d8cc2-kube-api-access-w2w72\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:18:05.360210 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.360219 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-stmhx\" (UniqueName: \"kubernetes.io/projected/f85a7385-6408-446c-9911-6446129baf21-kube-api-access-stmhx\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:18:05.360443 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.360229 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f85a7385-6408-446c-9911-6446129baf21-bundle\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:18:05.360443 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.360238 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bc060510-4b85-48e8-8bb0-4a4d470d8cc2-util\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:18:05.360443 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.360245 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bc060510-4b85-48e8-8bb0-4a4d470d8cc2-bundle\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:18:05.953844 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.953814 2566 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5" Apr 22 21:18:05.953844 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.953827 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5" event={"ID":"bc060510-4b85-48e8-8bb0-4a4d470d8cc2","Type":"ContainerDied","Data":"e5e8ec6cb1abc9b1ed423d3e34f4dc5be5378fa6db88d5ed7af4cba90c654f55"} Apr 22 21:18:05.954306 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.953866 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5e8ec6cb1abc9b1ed423d3e34f4dc5be5378fa6db88d5ed7af4cba90c654f55" Apr 22 21:18:05.955587 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.955561 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d" event={"ID":"f85a7385-6408-446c-9911-6446129baf21","Type":"ContainerDied","Data":"bc0c9702041d94d7a342d67d9d939c9a76d98075c2264106d307849b93d6c34d"} Apr 22 21:18:05.955587 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.955589 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc0c9702041d94d7a342d67d9d939c9a76d98075c2264106d307849b93d6c34d" Apr 22 21:18:05.955739 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:05.955593 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d" Apr 22 21:18:06.094022 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.093998 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw" Apr 22 21:18:06.111033 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.111001 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v" Apr 22 21:18:06.268732 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.268630 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/221ac19a-80b1-431f-9b74-31be7bb3ab98-util\") pod \"221ac19a-80b1-431f-9b74-31be7bb3ab98\" (UID: \"221ac19a-80b1-431f-9b74-31be7bb3ab98\") " Apr 22 21:18:06.268732 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.268696 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/221ac19a-80b1-431f-9b74-31be7bb3ab98-bundle\") pod \"221ac19a-80b1-431f-9b74-31be7bb3ab98\" (UID: \"221ac19a-80b1-431f-9b74-31be7bb3ab98\") " Apr 22 21:18:06.268732 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.268720 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bab93fad-8ef3-4d6c-baf4-8a9a2431581f-bundle\") pod \"bab93fad-8ef3-4d6c-baf4-8a9a2431581f\" (UID: \"bab93fad-8ef3-4d6c-baf4-8a9a2431581f\") " Apr 22 21:18:06.269030 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.268744 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69gqf\" (UniqueName: \"kubernetes.io/projected/bab93fad-8ef3-4d6c-baf4-8a9a2431581f-kube-api-access-69gqf\") pod \"bab93fad-8ef3-4d6c-baf4-8a9a2431581f\" (UID: \"bab93fad-8ef3-4d6c-baf4-8a9a2431581f\") " Apr 22 21:18:06.269030 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.268766 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsz72\" (UniqueName: \"kubernetes.io/projected/221ac19a-80b1-431f-9b74-31be7bb3ab98-kube-api-access-fsz72\") pod \"221ac19a-80b1-431f-9b74-31be7bb3ab98\" (UID: \"221ac19a-80b1-431f-9b74-31be7bb3ab98\") " Apr 22 21:18:06.269030 ip-10-0-134-137 
kubenswrapper[2566]: I0422 21:18:06.268791 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bab93fad-8ef3-4d6c-baf4-8a9a2431581f-util\") pod \"bab93fad-8ef3-4d6c-baf4-8a9a2431581f\" (UID: \"bab93fad-8ef3-4d6c-baf4-8a9a2431581f\") "
Apr 22 21:18:06.269306 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.269276 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bab93fad-8ef3-4d6c-baf4-8a9a2431581f-bundle" (OuterVolumeSpecName: "bundle") pod "bab93fad-8ef3-4d6c-baf4-8a9a2431581f" (UID: "bab93fad-8ef3-4d6c-baf4-8a9a2431581f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:18:06.269623 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.269597 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/221ac19a-80b1-431f-9b74-31be7bb3ab98-bundle" (OuterVolumeSpecName: "bundle") pod "221ac19a-80b1-431f-9b74-31be7bb3ab98" (UID: "221ac19a-80b1-431f-9b74-31be7bb3ab98"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:18:06.271040 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.271015 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bab93fad-8ef3-4d6c-baf4-8a9a2431581f-kube-api-access-69gqf" (OuterVolumeSpecName: "kube-api-access-69gqf") pod "bab93fad-8ef3-4d6c-baf4-8a9a2431581f" (UID: "bab93fad-8ef3-4d6c-baf4-8a9a2431581f"). InnerVolumeSpecName "kube-api-access-69gqf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 21:18:06.271516 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.271496 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/221ac19a-80b1-431f-9b74-31be7bb3ab98-kube-api-access-fsz72" (OuterVolumeSpecName: "kube-api-access-fsz72") pod "221ac19a-80b1-431f-9b74-31be7bb3ab98" (UID: "221ac19a-80b1-431f-9b74-31be7bb3ab98"). InnerVolumeSpecName "kube-api-access-fsz72". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 21:18:06.273865 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.273812 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bab93fad-8ef3-4d6c-baf4-8a9a2431581f-util" (OuterVolumeSpecName: "util") pod "bab93fad-8ef3-4d6c-baf4-8a9a2431581f" (UID: "bab93fad-8ef3-4d6c-baf4-8a9a2431581f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:18:06.275150 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.275129 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/221ac19a-80b1-431f-9b74-31be7bb3ab98-util" (OuterVolumeSpecName: "util") pod "221ac19a-80b1-431f-9b74-31be7bb3ab98" (UID: "221ac19a-80b1-431f-9b74-31be7bb3ab98"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:18:06.369928 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.369875 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bab93fad-8ef3-4d6c-baf4-8a9a2431581f-util\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\""
Apr 22 21:18:06.369928 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.369921 2566 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/221ac19a-80b1-431f-9b74-31be7bb3ab98-util\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\""
Apr 22 21:18:06.370154 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.369952 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/221ac19a-80b1-431f-9b74-31be7bb3ab98-bundle\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\""
Apr 22 21:18:06.370154 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.369962 2566 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bab93fad-8ef3-4d6c-baf4-8a9a2431581f-bundle\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\""
Apr 22 21:18:06.370154 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.369972 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-69gqf\" (UniqueName: \"kubernetes.io/projected/bab93fad-8ef3-4d6c-baf4-8a9a2431581f-kube-api-access-69gqf\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\""
Apr 22 21:18:06.370154 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.369983 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fsz72\" (UniqueName: \"kubernetes.io/projected/221ac19a-80b1-431f-9b74-31be7bb3ab98-kube-api-access-fsz72\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\""
Apr 22 21:18:06.960282 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.960238 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw" event={"ID":"221ac19a-80b1-431f-9b74-31be7bb3ab98","Type":"ContainerDied","Data":"589697a32bc7a12314159f38cd96f921ac3d1f5ebbaec13c4ce593ce71a779d9"}
Apr 22 21:18:06.960282 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.960266 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw"
Apr 22 21:18:06.960282 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.960277 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="589697a32bc7a12314159f38cd96f921ac3d1f5ebbaec13c4ce593ce71a779d9"
Apr 22 21:18:06.961909 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.961882 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v" event={"ID":"bab93fad-8ef3-4d6c-baf4-8a9a2431581f","Type":"ContainerDied","Data":"d1e989e9d3b51e8674448abeca63603a8e2e0e4c9cdc01c56b9db64ea8ddff58"}
Apr 22 21:18:06.961909 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.961907 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1e989e9d3b51e8674448abeca63603a8e2e0e4c9cdc01c56b9db64ea8ddff58"
Apr 22 21:18:06.962060 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:06.961907 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v"
Apr 22 21:18:19.689759 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.689727 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-msf8d"]
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.689998 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="221ac19a-80b1-431f-9b74-31be7bb3ab98" containerName="util"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690009 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="221ac19a-80b1-431f-9b74-31be7bb3ab98" containerName="util"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690016 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="221ac19a-80b1-431f-9b74-31be7bb3ab98" containerName="extract"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690022 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="221ac19a-80b1-431f-9b74-31be7bb3ab98" containerName="extract"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690030 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f85a7385-6408-446c-9911-6446129baf21" containerName="pull"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690035 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85a7385-6408-446c-9911-6446129baf21" containerName="pull"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690041 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc060510-4b85-48e8-8bb0-4a4d470d8cc2" containerName="extract"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690048 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc060510-4b85-48e8-8bb0-4a4d470d8cc2" containerName="extract"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690061 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bab93fad-8ef3-4d6c-baf4-8a9a2431581f" containerName="extract"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690066 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab93fad-8ef3-4d6c-baf4-8a9a2431581f" containerName="extract"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690073 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc060510-4b85-48e8-8bb0-4a4d470d8cc2" containerName="pull"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690078 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc060510-4b85-48e8-8bb0-4a4d470d8cc2" containerName="pull"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690084 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f85a7385-6408-446c-9911-6446129baf21" containerName="extract"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690089 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85a7385-6408-446c-9911-6446129baf21" containerName="extract"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690094 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="221ac19a-80b1-431f-9b74-31be7bb3ab98" containerName="pull"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690101 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="221ac19a-80b1-431f-9b74-31be7bb3ab98" containerName="pull"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690107 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bab93fad-8ef3-4d6c-baf4-8a9a2431581f" containerName="util"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690112 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab93fad-8ef3-4d6c-baf4-8a9a2431581f" containerName="util"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690118 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f85a7385-6408-446c-9911-6446129baf21" containerName="util"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690123 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85a7385-6408-446c-9911-6446129baf21" containerName="util"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690132 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bab93fad-8ef3-4d6c-baf4-8a9a2431581f" containerName="pull"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690139 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab93fad-8ef3-4d6c-baf4-8a9a2431581f" containerName="pull"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690180 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc060510-4b85-48e8-8bb0-4a4d470d8cc2" containerName="util"
Apr 22 21:18:19.690228 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690188 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc060510-4b85-48e8-8bb0-4a4d470d8cc2" containerName="util"
Apr 22 21:18:19.691124 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690260 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc060510-4b85-48e8-8bb0-4a4d470d8cc2" containerName="extract"
Apr 22 21:18:19.691124 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690276 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="221ac19a-80b1-431f-9b74-31be7bb3ab98" containerName="extract"
Apr 22 21:18:19.691124 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690288 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="bab93fad-8ef3-4d6c-baf4-8a9a2431581f" containerName="extract"
Apr 22 21:18:19.691124 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.690297 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="f85a7385-6408-446c-9911-6446129baf21" containerName="extract"
Apr 22 21:18:19.695185 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.695167 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-msf8d"
Apr 22 21:18:19.697219 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.697199 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 22 21:18:19.697800 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.697769 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-nrf7s\""
Apr 22 21:18:19.697979 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.697953 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 22 21:18:19.700667 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.700647 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-msf8d"]
Apr 22 21:18:19.764230 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.764196 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtq47\" (UniqueName: \"kubernetes.io/projected/7eef0b82-ed6e-4984-ab19-0c4b11aa7579-kube-api-access-qtq47\") pod \"authorino-operator-657f44b778-msf8d\" (UID: \"7eef0b82-ed6e-4984-ab19-0c4b11aa7579\") " pod="kuadrant-system/authorino-operator-657f44b778-msf8d"
Apr 22 21:18:19.864841 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.864800 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtq47\" (UniqueName: \"kubernetes.io/projected/7eef0b82-ed6e-4984-ab19-0c4b11aa7579-kube-api-access-qtq47\") pod \"authorino-operator-657f44b778-msf8d\" (UID: \"7eef0b82-ed6e-4984-ab19-0c4b11aa7579\") " pod="kuadrant-system/authorino-operator-657f44b778-msf8d"
Apr 22 21:18:19.872180 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:19.872147 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtq47\" (UniqueName: \"kubernetes.io/projected/7eef0b82-ed6e-4984-ab19-0c4b11aa7579-kube-api-access-qtq47\") pod \"authorino-operator-657f44b778-msf8d\" (UID: \"7eef0b82-ed6e-4984-ab19-0c4b11aa7579\") " pod="kuadrant-system/authorino-operator-657f44b778-msf8d"
Apr 22 21:18:20.006773 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:20.006689 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-msf8d"
Apr 22 21:18:20.129013 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:20.128986 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-msf8d"]
Apr 22 21:18:20.131150 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:18:20.131113 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7eef0b82_ed6e_4984_ab19_0c4b11aa7579.slice/crio-762f8eaa6b7294be009dcf3b07aedfd723c2efb59cb1beeb3fa3f37ce89d59e1 WatchSource:0}: Error finding container 762f8eaa6b7294be009dcf3b07aedfd723c2efb59cb1beeb3fa3f37ce89d59e1: Status 404 returned error can't find the container with id 762f8eaa6b7294be009dcf3b07aedfd723c2efb59cb1beeb3fa3f37ce89d59e1
Apr 22 21:18:21.011097 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:21.011055 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-msf8d" event={"ID":"7eef0b82-ed6e-4984-ab19-0c4b11aa7579","Type":"ContainerStarted","Data":"762f8eaa6b7294be009dcf3b07aedfd723c2efb59cb1beeb3fa3f37ce89d59e1"}
Apr 22 21:18:22.016353 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:22.016323 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-msf8d" event={"ID":"7eef0b82-ed6e-4984-ab19-0c4b11aa7579","Type":"ContainerStarted","Data":"2766ac691cfbae0473a8fa7342f7db529e869d439cbca4eada3613a0686e774c"}
Apr 22 21:18:22.016672 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:22.016447 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-msf8d"
Apr 22 21:18:22.037082 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:22.037030 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-msf8d" podStartSLOduration=1.219552272 podStartE2EDuration="3.037016014s" podCreationTimestamp="2026-04-22 21:18:19 +0000 UTC" firstStartedPulling="2026-04-22 21:18:20.133024487 +0000 UTC m=+557.489839217" lastFinishedPulling="2026-04-22 21:18:21.950488228 +0000 UTC m=+559.307302959" observedRunningTime="2026-04-22 21:18:22.034438498 +0000 UTC m=+559.391253244" watchObservedRunningTime="2026-04-22 21:18:22.037016014 +0000 UTC m=+559.393830816"
Apr 22 21:18:33.022286 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:18:33.022252 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-msf8d"
Apr 22 21:19:03.032526 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:03.032495 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovn-acl-logging/0.log"
Apr 22 21:19:03.033183 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:03.033163 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovn-acl-logging/0.log"
Apr 22 21:19:03.728465 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:03.728385 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-btng5"]
Apr 22 21:19:03.732347 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:03.732325 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-btng5"
Apr 22 21:19:03.737570 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:03.734621 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-5sldk\""
Apr 22 21:19:03.737570 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:03.734919 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 22 21:19:03.741770 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:03.741743 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-btng5"]
Apr 22 21:19:03.791307 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:03.791266 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ba417d0d-5f72-4da1-a23d-0b0e5f1804da-config-file\") pod \"limitador-limitador-7d549b5b-btng5\" (UID: \"ba417d0d-5f72-4da1-a23d-0b0e5f1804da\") " pod="kuadrant-system/limitador-limitador-7d549b5b-btng5"
Apr 22 21:19:03.791506 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:03.791327 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm7qv\" (UniqueName: \"kubernetes.io/projected/ba417d0d-5f72-4da1-a23d-0b0e5f1804da-kube-api-access-sm7qv\") pod \"limitador-limitador-7d549b5b-btng5\" (UID: \"ba417d0d-5f72-4da1-a23d-0b0e5f1804da\") " pod="kuadrant-system/limitador-limitador-7d549b5b-btng5"
Apr 22 21:19:03.823931 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:03.823898 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-btng5"]
Apr 22 21:19:03.892433 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:03.892381 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ba417d0d-5f72-4da1-a23d-0b0e5f1804da-config-file\") pod \"limitador-limitador-7d549b5b-btng5\" (UID: \"ba417d0d-5f72-4da1-a23d-0b0e5f1804da\") " pod="kuadrant-system/limitador-limitador-7d549b5b-btng5"
Apr 22 21:19:03.892598 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:03.892456 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sm7qv\" (UniqueName: \"kubernetes.io/projected/ba417d0d-5f72-4da1-a23d-0b0e5f1804da-kube-api-access-sm7qv\") pod \"limitador-limitador-7d549b5b-btng5\" (UID: \"ba417d0d-5f72-4da1-a23d-0b0e5f1804da\") " pod="kuadrant-system/limitador-limitador-7d549b5b-btng5"
Apr 22 21:19:03.893039 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:03.893019 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ba417d0d-5f72-4da1-a23d-0b0e5f1804da-config-file\") pod \"limitador-limitador-7d549b5b-btng5\" (UID: \"ba417d0d-5f72-4da1-a23d-0b0e5f1804da\") " pod="kuadrant-system/limitador-limitador-7d549b5b-btng5"
Apr 22 21:19:03.899316 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:03.899292 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm7qv\" (UniqueName: \"kubernetes.io/projected/ba417d0d-5f72-4da1-a23d-0b0e5f1804da-kube-api-access-sm7qv\") pod \"limitador-limitador-7d549b5b-btng5\" (UID: \"ba417d0d-5f72-4da1-a23d-0b0e5f1804da\") " pod="kuadrant-system/limitador-limitador-7d549b5b-btng5"
Apr 22 21:19:04.048883 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:04.048771 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-btng5"
Apr 22 21:19:04.178732 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:04.178704 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-btng5"]
Apr 22 21:19:04.181182 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:19:04.181149 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba417d0d_5f72_4da1_a23d_0b0e5f1804da.slice/crio-9be44750d183bc17313d3cb9cba39563b7ceac0850f1db52c144019c5a2316a3 WatchSource:0}: Error finding container 9be44750d183bc17313d3cb9cba39563b7ceac0850f1db52c144019c5a2316a3: Status 404 returned error can't find the container with id 9be44750d183bc17313d3cb9cba39563b7ceac0850f1db52c144019c5a2316a3
Apr 22 21:19:05.165432 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:05.165378 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-btng5" event={"ID":"ba417d0d-5f72-4da1-a23d-0b0e5f1804da","Type":"ContainerStarted","Data":"9be44750d183bc17313d3cb9cba39563b7ceac0850f1db52c144019c5a2316a3"}
Apr 22 21:19:07.174096 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:07.174058 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-btng5" event={"ID":"ba417d0d-5f72-4da1-a23d-0b0e5f1804da","Type":"ContainerStarted","Data":"2ce89252f574fa3eff62a9c07dfe0053f765dacbc32ada60cd0101a03c56a512"}
Apr 22 21:19:07.174484 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:07.174213 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-btng5"
Apr 22 21:19:07.192930 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:07.192838 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-btng5" podStartSLOduration=1.4373422599999999 podStartE2EDuration="4.192826357s" podCreationTimestamp="2026-04-22 21:19:03 +0000 UTC" firstStartedPulling="2026-04-22 21:19:04.183586488 +0000 UTC m=+601.540401222" lastFinishedPulling="2026-04-22 21:19:06.93907059 +0000 UTC m=+604.295885319" observedRunningTime="2026-04-22 21:19:07.190751639 +0000 UTC m=+604.547566403" watchObservedRunningTime="2026-04-22 21:19:07.192826357 +0000 UTC m=+604.549641109"
Apr 22 21:19:18.179145 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:18.179109 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-btng5"
Apr 22 21:19:21.010727 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:21.010687 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-btng5"]
Apr 22 21:19:21.011205 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:21.010948 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-btng5" podUID="ba417d0d-5f72-4da1-a23d-0b0e5f1804da" containerName="limitador" containerID="cri-o://2ce89252f574fa3eff62a9c07dfe0053f765dacbc32ada60cd0101a03c56a512" gracePeriod=30
Apr 22 21:19:21.946462 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:21.946436 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-btng5"
Apr 22 21:19:22.050123 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:22.050094 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ba417d0d-5f72-4da1-a23d-0b0e5f1804da-config-file\") pod \"ba417d0d-5f72-4da1-a23d-0b0e5f1804da\" (UID: \"ba417d0d-5f72-4da1-a23d-0b0e5f1804da\") "
Apr 22 21:19:22.050535 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:22.050193 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm7qv\" (UniqueName: \"kubernetes.io/projected/ba417d0d-5f72-4da1-a23d-0b0e5f1804da-kube-api-access-sm7qv\") pod \"ba417d0d-5f72-4da1-a23d-0b0e5f1804da\" (UID: \"ba417d0d-5f72-4da1-a23d-0b0e5f1804da\") "
Apr 22 21:19:22.050535 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:22.050502 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba417d0d-5f72-4da1-a23d-0b0e5f1804da-config-file" (OuterVolumeSpecName: "config-file") pod "ba417d0d-5f72-4da1-a23d-0b0e5f1804da" (UID: "ba417d0d-5f72-4da1-a23d-0b0e5f1804da"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 21:19:22.052317 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:22.052284 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba417d0d-5f72-4da1-a23d-0b0e5f1804da-kube-api-access-sm7qv" (OuterVolumeSpecName: "kube-api-access-sm7qv") pod "ba417d0d-5f72-4da1-a23d-0b0e5f1804da" (UID: "ba417d0d-5f72-4da1-a23d-0b0e5f1804da"). InnerVolumeSpecName "kube-api-access-sm7qv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 21:19:22.150720 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:22.150634 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sm7qv\" (UniqueName: \"kubernetes.io/projected/ba417d0d-5f72-4da1-a23d-0b0e5f1804da-kube-api-access-sm7qv\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\""
Apr 22 21:19:22.150720 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:22.150673 2566 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/ba417d0d-5f72-4da1-a23d-0b0e5f1804da-config-file\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\""
Apr 22 21:19:22.227483 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:22.227446 2566 generic.go:358] "Generic (PLEG): container finished" podID="ba417d0d-5f72-4da1-a23d-0b0e5f1804da" containerID="2ce89252f574fa3eff62a9c07dfe0053f765dacbc32ada60cd0101a03c56a512" exitCode=0
Apr 22 21:19:22.227668 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:22.227520 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-btng5"
Apr 22 21:19:22.227668 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:22.227518 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-btng5" event={"ID":"ba417d0d-5f72-4da1-a23d-0b0e5f1804da","Type":"ContainerDied","Data":"2ce89252f574fa3eff62a9c07dfe0053f765dacbc32ada60cd0101a03c56a512"}
Apr 22 21:19:22.227668 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:22.227623 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-btng5" event={"ID":"ba417d0d-5f72-4da1-a23d-0b0e5f1804da","Type":"ContainerDied","Data":"9be44750d183bc17313d3cb9cba39563b7ceac0850f1db52c144019c5a2316a3"}
Apr 22 21:19:22.227668 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:22.227642 2566 scope.go:117] "RemoveContainer" containerID="2ce89252f574fa3eff62a9c07dfe0053f765dacbc32ada60cd0101a03c56a512"
Apr 22 21:19:22.236612 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:22.236590 2566 scope.go:117] "RemoveContainer" containerID="2ce89252f574fa3eff62a9c07dfe0053f765dacbc32ada60cd0101a03c56a512"
Apr 22 21:19:22.236872 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:19:22.236853 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ce89252f574fa3eff62a9c07dfe0053f765dacbc32ada60cd0101a03c56a512\": container with ID starting with 2ce89252f574fa3eff62a9c07dfe0053f765dacbc32ada60cd0101a03c56a512 not found: ID does not exist" containerID="2ce89252f574fa3eff62a9c07dfe0053f765dacbc32ada60cd0101a03c56a512"
Apr 22 21:19:22.236920 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:22.236881 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ce89252f574fa3eff62a9c07dfe0053f765dacbc32ada60cd0101a03c56a512"} err="failed to get container status \"2ce89252f574fa3eff62a9c07dfe0053f765dacbc32ada60cd0101a03c56a512\": rpc error: code = NotFound desc = could not find container \"2ce89252f574fa3eff62a9c07dfe0053f765dacbc32ada60cd0101a03c56a512\": container with ID starting with 2ce89252f574fa3eff62a9c07dfe0053f765dacbc32ada60cd0101a03c56a512 not found: ID does not exist"
Apr 22 21:19:22.247583 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:22.247550 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-btng5"]
Apr 22 21:19:22.252976 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:22.252949 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-btng5"]
Apr 22 21:19:23.133253 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:23.133220 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba417d0d-5f72-4da1-a23d-0b0e5f1804da" path="/var/lib/kubelet/pods/ba417d0d-5f72-4da1-a23d-0b0e5f1804da/volumes"
Apr 22 21:19:24.820547 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:24.820513 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-vrg5c"]
Apr 22 21:19:24.821047 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:24.820967 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ba417d0d-5f72-4da1-a23d-0b0e5f1804da" containerName="limitador"
Apr 22 21:19:24.821047 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:24.820988 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba417d0d-5f72-4da1-a23d-0b0e5f1804da" containerName="limitador"
Apr 22 21:19:24.821175 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:24.821065 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="ba417d0d-5f72-4da1-a23d-0b0e5f1804da" containerName="limitador"
Apr 22 21:19:24.823845 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:24.823826 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-vrg5c"
Apr 22 21:19:24.825671 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:24.825639 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 22 21:19:24.825792 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:24.825640 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-s9pgd\""
Apr 22 21:19:24.830321 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:24.830292 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-vrg5c"]
Apr 22 21:19:24.972648 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:24.972611 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ee48ec0c-cc94-4b7f-9f3a-f4f896fe6025-data\") pod \"postgres-868db5846d-vrg5c\" (UID: \"ee48ec0c-cc94-4b7f-9f3a-f4f896fe6025\") " pod="opendatahub/postgres-868db5846d-vrg5c"
Apr 22 21:19:24.972648 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:24.972645 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bln9v\" (UniqueName: \"kubernetes.io/projected/ee48ec0c-cc94-4b7f-9f3a-f4f896fe6025-kube-api-access-bln9v\") pod \"postgres-868db5846d-vrg5c\" (UID: \"ee48ec0c-cc94-4b7f-9f3a-f4f896fe6025\") " pod="opendatahub/postgres-868db5846d-vrg5c"
Apr 22 21:19:25.073693 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:25.073609 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ee48ec0c-cc94-4b7f-9f3a-f4f896fe6025-data\") pod \"postgres-868db5846d-vrg5c\" (UID: \"ee48ec0c-cc94-4b7f-9f3a-f4f896fe6025\") " pod="opendatahub/postgres-868db5846d-vrg5c"
Apr 22 21:19:25.073693 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:25.073651 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bln9v\" (UniqueName: \"kubernetes.io/projected/ee48ec0c-cc94-4b7f-9f3a-f4f896fe6025-kube-api-access-bln9v\") pod \"postgres-868db5846d-vrg5c\" (UID: \"ee48ec0c-cc94-4b7f-9f3a-f4f896fe6025\") " pod="opendatahub/postgres-868db5846d-vrg5c"
Apr 22 21:19:25.073991 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:25.073969 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/ee48ec0c-cc94-4b7f-9f3a-f4f896fe6025-data\") pod \"postgres-868db5846d-vrg5c\" (UID: \"ee48ec0c-cc94-4b7f-9f3a-f4f896fe6025\") " pod="opendatahub/postgres-868db5846d-vrg5c"
Apr 22 21:19:25.080732 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:25.080706 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bln9v\" (UniqueName: \"kubernetes.io/projected/ee48ec0c-cc94-4b7f-9f3a-f4f896fe6025-kube-api-access-bln9v\") pod \"postgres-868db5846d-vrg5c\" (UID: \"ee48ec0c-cc94-4b7f-9f3a-f4f896fe6025\") " pod="opendatahub/postgres-868db5846d-vrg5c"
Apr 22 21:19:25.136468 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:25.136439 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-vrg5c"
Apr 22 21:19:25.257892 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:25.257864 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-vrg5c"]
Apr 22 21:19:25.260035 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:19:25.260007 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee48ec0c_cc94_4b7f_9f3a_f4f896fe6025.slice/crio-58145c22aa2c3d924467cafbe1d004a077c616cdfd87f4795271849884b87d8f WatchSource:0}: Error finding container 58145c22aa2c3d924467cafbe1d004a077c616cdfd87f4795271849884b87d8f: Status 404 returned error can't find the container with id 58145c22aa2c3d924467cafbe1d004a077c616cdfd87f4795271849884b87d8f
Apr 22 21:19:26.246639 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:26.246581 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-vrg5c" event={"ID":"ee48ec0c-cc94-4b7f-9f3a-f4f896fe6025","Type":"ContainerStarted","Data":"58145c22aa2c3d924467cafbe1d004a077c616cdfd87f4795271849884b87d8f"}
Apr 22 21:19:30.265043 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:30.264960 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-vrg5c" event={"ID":"ee48ec0c-cc94-4b7f-9f3a-f4f896fe6025","Type":"ContainerStarted","Data":"dfddea5b7ebe1d6ea35b4378537686f03f36a90b86c37f517f03454e1335f139"}
Apr 22 21:19:30.265379 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:30.265064 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-vrg5c"
Apr 22 21:19:30.281806 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:30.281751 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-vrg5c" podStartSLOduration=1.626663609 podStartE2EDuration="6.281734296s" podCreationTimestamp="2026-04-22 21:19:24 +0000 UTC" firstStartedPulling="2026-04-22 21:19:25.261298348 +0000 UTC m=+622.618113077" lastFinishedPulling="2026-04-22 21:19:29.916369035 +0000 UTC m=+627.273183764" observedRunningTime="2026-04-22 21:19:30.277989029 +0000 UTC m=+627.634803781" watchObservedRunningTime="2026-04-22 21:19:30.281734296 +0000 UTC m=+627.638549050"
Apr 22 21:19:36.297420 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:36.297374 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-vrg5c"
Apr 22 21:19:39.617063 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:39.617028 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-flm5d"]
Apr 22 21:19:39.620385 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:39.620363 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-flm5d"
Apr 22 21:19:39.622861 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:39.622833 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-wvptf\""
Apr 22 21:19:39.627556 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:39.627531 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-flm5d"]
Apr 22 21:19:39.686566 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:39.686537 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwsl2\" (UniqueName: \"kubernetes.io/projected/773599c7-7cba-42a7-ae31-13d90d3c998b-kube-api-access-pwsl2\") pod \"maas-controller-6d4c8f55f9-flm5d\" (UID: \"773599c7-7cba-42a7-ae31-13d90d3c998b\") " pod="opendatahub/maas-controller-6d4c8f55f9-flm5d"
Apr 22 21:19:39.755426 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:39.755380 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-b44d57d8d-ghk72"]
Apr 22 21:19:39.758906 ip-10-0-134-137
kubenswrapper[2566]: I0422 21:19:39.758883 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-b44d57d8d-ghk72" Apr 22 21:19:39.764479 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:39.764451 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-b44d57d8d-ghk72"] Apr 22 21:19:39.787198 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:39.787162 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tctvw\" (UniqueName: \"kubernetes.io/projected/08d073ec-ea7e-4ce7-8a9d-663380c209c3-kube-api-access-tctvw\") pod \"maas-controller-b44d57d8d-ghk72\" (UID: \"08d073ec-ea7e-4ce7-8a9d-663380c209c3\") " pod="opendatahub/maas-controller-b44d57d8d-ghk72" Apr 22 21:19:39.787358 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:39.787208 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwsl2\" (UniqueName: \"kubernetes.io/projected/773599c7-7cba-42a7-ae31-13d90d3c998b-kube-api-access-pwsl2\") pod \"maas-controller-6d4c8f55f9-flm5d\" (UID: \"773599c7-7cba-42a7-ae31-13d90d3c998b\") " pod="opendatahub/maas-controller-6d4c8f55f9-flm5d" Apr 22 21:19:39.794705 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:39.794673 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwsl2\" (UniqueName: \"kubernetes.io/projected/773599c7-7cba-42a7-ae31-13d90d3c998b-kube-api-access-pwsl2\") pod \"maas-controller-6d4c8f55f9-flm5d\" (UID: \"773599c7-7cba-42a7-ae31-13d90d3c998b\") " pod="opendatahub/maas-controller-6d4c8f55f9-flm5d" Apr 22 21:19:39.871715 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:39.871640 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-flm5d"] Apr 22 21:19:39.871896 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:39.871884 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-flm5d" Apr 22 21:19:39.888035 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:39.888008 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tctvw\" (UniqueName: \"kubernetes.io/projected/08d073ec-ea7e-4ce7-8a9d-663380c209c3-kube-api-access-tctvw\") pod \"maas-controller-b44d57d8d-ghk72\" (UID: \"08d073ec-ea7e-4ce7-8a9d-663380c209c3\") " pod="opendatahub/maas-controller-b44d57d8d-ghk72" Apr 22 21:19:39.896755 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:39.896723 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-5c6497bbdb-mpn66"] Apr 22 21:19:39.901720 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:39.901695 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5c6497bbdb-mpn66" Apr 22 21:19:39.904267 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:39.904237 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tctvw\" (UniqueName: \"kubernetes.io/projected/08d073ec-ea7e-4ce7-8a9d-663380c209c3-kube-api-access-tctvw\") pod \"maas-controller-b44d57d8d-ghk72\" (UID: \"08d073ec-ea7e-4ce7-8a9d-663380c209c3\") " pod="opendatahub/maas-controller-b44d57d8d-ghk72" Apr 22 21:19:39.905871 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:39.905845 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5c6497bbdb-mpn66"] Apr 22 21:19:39.988506 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:39.988457 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbw7d\" (UniqueName: \"kubernetes.io/projected/9df6dd0d-9819-42a7-a1b7-06ca5e59f67e-kube-api-access-xbw7d\") pod \"maas-controller-5c6497bbdb-mpn66\" (UID: \"9df6dd0d-9819-42a7-a1b7-06ca5e59f67e\") " pod="opendatahub/maas-controller-5c6497bbdb-mpn66" Apr 22 21:19:40.005466 
ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:40.005435 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-flm5d"] Apr 22 21:19:40.007402 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:19:40.007369 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod773599c7_7cba_42a7_ae31_13d90d3c998b.slice/crio-98e8e09f903a337ac7c1863ea7e67daa5fb3d849e9711111001465ac2d8bb8f7 WatchSource:0}: Error finding container 98e8e09f903a337ac7c1863ea7e67daa5fb3d849e9711111001465ac2d8bb8f7: Status 404 returned error can't find the container with id 98e8e09f903a337ac7c1863ea7e67daa5fb3d849e9711111001465ac2d8bb8f7 Apr 22 21:19:40.070803 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:40.070756 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-b44d57d8d-ghk72" Apr 22 21:19:40.088880 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:40.088848 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbw7d\" (UniqueName: \"kubernetes.io/projected/9df6dd0d-9819-42a7-a1b7-06ca5e59f67e-kube-api-access-xbw7d\") pod \"maas-controller-5c6497bbdb-mpn66\" (UID: \"9df6dd0d-9819-42a7-a1b7-06ca5e59f67e\") " pod="opendatahub/maas-controller-5c6497bbdb-mpn66" Apr 22 21:19:40.096294 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:40.096259 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbw7d\" (UniqueName: \"kubernetes.io/projected/9df6dd0d-9819-42a7-a1b7-06ca5e59f67e-kube-api-access-xbw7d\") pod \"maas-controller-5c6497bbdb-mpn66\" (UID: \"9df6dd0d-9819-42a7-a1b7-06ca5e59f67e\") " pod="opendatahub/maas-controller-5c6497bbdb-mpn66" Apr 22 21:19:40.191807 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:40.191780 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-b44d57d8d-ghk72"] Apr 22 21:19:40.193565 
ip-10-0-134-137 kubenswrapper[2566]: W0422 21:19:40.193528 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08d073ec_ea7e_4ce7_8a9d_663380c209c3.slice/crio-b81fadab98341566b26ef844aa893fe888ba748b7628a6da60950a7857fb407e WatchSource:0}: Error finding container b81fadab98341566b26ef844aa893fe888ba748b7628a6da60950a7857fb407e: Status 404 returned error can't find the container with id b81fadab98341566b26ef844aa893fe888ba748b7628a6da60950a7857fb407e Apr 22 21:19:40.222714 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:40.222679 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5c6497bbdb-mpn66" Apr 22 21:19:40.303565 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:40.303529 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-b44d57d8d-ghk72" event={"ID":"08d073ec-ea7e-4ce7-8a9d-663380c209c3","Type":"ContainerStarted","Data":"b81fadab98341566b26ef844aa893fe888ba748b7628a6da60950a7857fb407e"} Apr 22 21:19:40.305176 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:40.305104 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-flm5d" event={"ID":"773599c7-7cba-42a7-ae31-13d90d3c998b","Type":"ContainerStarted","Data":"98e8e09f903a337ac7c1863ea7e67daa5fb3d849e9711111001465ac2d8bb8f7"} Apr 22 21:19:40.348486 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:40.348459 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5c6497bbdb-mpn66"] Apr 22 21:19:40.350035 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:19:40.350002 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9df6dd0d_9819_42a7_a1b7_06ca5e59f67e.slice/crio-669f0e6e416068e626710e32c2bc479a61a0dc9192651f02e9d936bccbc4a071 WatchSource:0}: Error finding container 
669f0e6e416068e626710e32c2bc479a61a0dc9192651f02e9d936bccbc4a071: Status 404 returned error can't find the container with id 669f0e6e416068e626710e32c2bc479a61a0dc9192651f02e9d936bccbc4a071 Apr 22 21:19:41.314988 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:41.314926 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5c6497bbdb-mpn66" event={"ID":"9df6dd0d-9819-42a7-a1b7-06ca5e59f67e","Type":"ContainerStarted","Data":"669f0e6e416068e626710e32c2bc479a61a0dc9192651f02e9d936bccbc4a071"} Apr 22 21:19:44.328955 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:44.328923 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-b44d57d8d-ghk72" event={"ID":"08d073ec-ea7e-4ce7-8a9d-663380c209c3","Type":"ContainerStarted","Data":"8687ee7536e5ee8df1a2a75b6d2dd71cbce281786e4d6264ae96197e09d695ea"} Apr 22 21:19:44.329449 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:44.329037 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-b44d57d8d-ghk72" Apr 22 21:19:44.330326 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:44.330298 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5c6497bbdb-mpn66" event={"ID":"9df6dd0d-9819-42a7-a1b7-06ca5e59f67e","Type":"ContainerStarted","Data":"fc65d9102777541fdbfc31d33a47fba6835b47ef85e2840bcc7e9d00133365fe"} Apr 22 21:19:44.330479 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:44.330398 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-5c6497bbdb-mpn66" Apr 22 21:19:44.331595 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:44.331570 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-flm5d" event={"ID":"773599c7-7cba-42a7-ae31-13d90d3c998b","Type":"ContainerStarted","Data":"6e56adb149e8412122687c37db37402cad465212f9d3c2b93a1e3f5662765fb5"} Apr 22 21:19:44.331672 
ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:44.331637 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-flm5d" podUID="773599c7-7cba-42a7-ae31-13d90d3c998b" containerName="manager" containerID="cri-o://6e56adb149e8412122687c37db37402cad465212f9d3c2b93a1e3f5662765fb5" gracePeriod=10 Apr 22 21:19:44.331731 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:44.331682 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-flm5d" Apr 22 21:19:44.359549 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:44.359488 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-b44d57d8d-ghk72" podStartSLOduration=2.026645352 podStartE2EDuration="5.359471018s" podCreationTimestamp="2026-04-22 21:19:39 +0000 UTC" firstStartedPulling="2026-04-22 21:19:40.194895923 +0000 UTC m=+637.551710657" lastFinishedPulling="2026-04-22 21:19:43.527721593 +0000 UTC m=+640.884536323" observedRunningTime="2026-04-22 21:19:44.355368161 +0000 UTC m=+641.712182914" watchObservedRunningTime="2026-04-22 21:19:44.359471018 +0000 UTC m=+641.716285771" Apr 22 21:19:44.386449 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:44.386377 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-flm5d" podStartSLOduration=1.867567518 podStartE2EDuration="5.386362692s" podCreationTimestamp="2026-04-22 21:19:39 +0000 UTC" firstStartedPulling="2026-04-22 21:19:40.008818227 +0000 UTC m=+637.365632961" lastFinishedPulling="2026-04-22 21:19:43.527613386 +0000 UTC m=+640.884428135" observedRunningTime="2026-04-22 21:19:44.384855835 +0000 UTC m=+641.741670598" watchObservedRunningTime="2026-04-22 21:19:44.386362692 +0000 UTC m=+641.743177448" Apr 22 21:19:44.428899 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:44.428848 2566 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="opendatahub/maas-controller-5c6497bbdb-mpn66" podStartSLOduration=2.245702193 podStartE2EDuration="5.428832895s" podCreationTimestamp="2026-04-22 21:19:39 +0000 UTC" firstStartedPulling="2026-04-22 21:19:40.351459666 +0000 UTC m=+637.708274399" lastFinishedPulling="2026-04-22 21:19:43.534590372 +0000 UTC m=+640.891405101" observedRunningTime="2026-04-22 21:19:44.426936379 +0000 UTC m=+641.783751130" watchObservedRunningTime="2026-04-22 21:19:44.428832895 +0000 UTC m=+641.785647647" Apr 22 21:19:44.579960 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:44.579900 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-flm5d" Apr 22 21:19:44.627638 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:44.627609 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwsl2\" (UniqueName: \"kubernetes.io/projected/773599c7-7cba-42a7-ae31-13d90d3c998b-kube-api-access-pwsl2\") pod \"773599c7-7cba-42a7-ae31-13d90d3c998b\" (UID: \"773599c7-7cba-42a7-ae31-13d90d3c998b\") " Apr 22 21:19:44.629740 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:44.629716 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/773599c7-7cba-42a7-ae31-13d90d3c998b-kube-api-access-pwsl2" (OuterVolumeSpecName: "kube-api-access-pwsl2") pod "773599c7-7cba-42a7-ae31-13d90d3c998b" (UID: "773599c7-7cba-42a7-ae31-13d90d3c998b"). InnerVolumeSpecName "kube-api-access-pwsl2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:19:44.728725 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:44.728687 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pwsl2\" (UniqueName: \"kubernetes.io/projected/773599c7-7cba-42a7-ae31-13d90d3c998b-kube-api-access-pwsl2\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:19:45.335762 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.335726 2566 generic.go:358] "Generic (PLEG): container finished" podID="773599c7-7cba-42a7-ae31-13d90d3c998b" containerID="6e56adb149e8412122687c37db37402cad465212f9d3c2b93a1e3f5662765fb5" exitCode=0 Apr 22 21:19:45.336244 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.335794 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-flm5d" Apr 22 21:19:45.336244 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.335816 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-flm5d" event={"ID":"773599c7-7cba-42a7-ae31-13d90d3c998b","Type":"ContainerDied","Data":"6e56adb149e8412122687c37db37402cad465212f9d3c2b93a1e3f5662765fb5"} Apr 22 21:19:45.336244 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.335866 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-flm5d" event={"ID":"773599c7-7cba-42a7-ae31-13d90d3c998b","Type":"ContainerDied","Data":"98e8e09f903a337ac7c1863ea7e67daa5fb3d849e9711111001465ac2d8bb8f7"} Apr 22 21:19:45.336244 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.335884 2566 scope.go:117] "RemoveContainer" containerID="6e56adb149e8412122687c37db37402cad465212f9d3c2b93a1e3f5662765fb5" Apr 22 21:19:45.344262 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.344245 2566 scope.go:117] "RemoveContainer" containerID="6e56adb149e8412122687c37db37402cad465212f9d3c2b93a1e3f5662765fb5" Apr 22 21:19:45.344559 ip-10-0-134-137 
kubenswrapper[2566]: E0422 21:19:45.344539 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e56adb149e8412122687c37db37402cad465212f9d3c2b93a1e3f5662765fb5\": container with ID starting with 6e56adb149e8412122687c37db37402cad465212f9d3c2b93a1e3f5662765fb5 not found: ID does not exist" containerID="6e56adb149e8412122687c37db37402cad465212f9d3c2b93a1e3f5662765fb5" Apr 22 21:19:45.344638 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.344567 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e56adb149e8412122687c37db37402cad465212f9d3c2b93a1e3f5662765fb5"} err="failed to get container status \"6e56adb149e8412122687c37db37402cad465212f9d3c2b93a1e3f5662765fb5\": rpc error: code = NotFound desc = could not find container \"6e56adb149e8412122687c37db37402cad465212f9d3c2b93a1e3f5662765fb5\": container with ID starting with 6e56adb149e8412122687c37db37402cad465212f9d3c2b93a1e3f5662765fb5 not found: ID does not exist" Apr 22 21:19:45.351317 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.351291 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-flm5d"] Apr 22 21:19:45.353200 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.353177 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-flm5d"] Apr 22 21:19:45.656288 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.656255 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-6c5c5db65c-k7mmd"] Apr 22 21:19:45.656599 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.656584 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="773599c7-7cba-42a7-ae31-13d90d3c998b" containerName="manager" Apr 22 21:19:45.656657 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.656600 2566 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="773599c7-7cba-42a7-ae31-13d90d3c998b" containerName="manager" Apr 22 21:19:45.656690 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.656662 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="773599c7-7cba-42a7-ae31-13d90d3c998b" containerName="manager" Apr 22 21:19:45.660726 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.660707 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6c5c5db65c-k7mmd" Apr 22 21:19:45.662642 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.662618 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 22 21:19:45.662771 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.662670 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 22 21:19:45.662771 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.662692 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-p9m6q\"" Apr 22 21:19:45.666396 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.666368 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6c5c5db65c-k7mmd"] Apr 22 21:19:45.735275 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.735238 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/ab033c72-3077-4a4b-9d6c-06f84462f82e-maas-api-tls\") pod \"maas-api-6c5c5db65c-k7mmd\" (UID: \"ab033c72-3077-4a4b-9d6c-06f84462f82e\") " pod="opendatahub/maas-api-6c5c5db65c-k7mmd" Apr 22 21:19:45.735447 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.735285 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhfxd\" (UniqueName: 
\"kubernetes.io/projected/ab033c72-3077-4a4b-9d6c-06f84462f82e-kube-api-access-qhfxd\") pod \"maas-api-6c5c5db65c-k7mmd\" (UID: \"ab033c72-3077-4a4b-9d6c-06f84462f82e\") " pod="opendatahub/maas-api-6c5c5db65c-k7mmd" Apr 22 21:19:45.836401 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.836361 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhfxd\" (UniqueName: \"kubernetes.io/projected/ab033c72-3077-4a4b-9d6c-06f84462f82e-kube-api-access-qhfxd\") pod \"maas-api-6c5c5db65c-k7mmd\" (UID: \"ab033c72-3077-4a4b-9d6c-06f84462f82e\") " pod="opendatahub/maas-api-6c5c5db65c-k7mmd" Apr 22 21:19:45.836630 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.836489 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/ab033c72-3077-4a4b-9d6c-06f84462f82e-maas-api-tls\") pod \"maas-api-6c5c5db65c-k7mmd\" (UID: \"ab033c72-3077-4a4b-9d6c-06f84462f82e\") " pod="opendatahub/maas-api-6c5c5db65c-k7mmd" Apr 22 21:19:45.838906 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.838880 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/ab033c72-3077-4a4b-9d6c-06f84462f82e-maas-api-tls\") pod \"maas-api-6c5c5db65c-k7mmd\" (UID: \"ab033c72-3077-4a4b-9d6c-06f84462f82e\") " pod="opendatahub/maas-api-6c5c5db65c-k7mmd" Apr 22 21:19:45.845185 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.845161 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhfxd\" (UniqueName: \"kubernetes.io/projected/ab033c72-3077-4a4b-9d6c-06f84462f82e-kube-api-access-qhfxd\") pod \"maas-api-6c5c5db65c-k7mmd\" (UID: \"ab033c72-3077-4a4b-9d6c-06f84462f82e\") " pod="opendatahub/maas-api-6c5c5db65c-k7mmd" Apr 22 21:19:45.973020 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:45.972925 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-6c5c5db65c-k7mmd" Apr 22 21:19:46.095109 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:46.095081 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6c5c5db65c-k7mmd"] Apr 22 21:19:46.097426 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:19:46.097384 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab033c72_3077_4a4b_9d6c_06f84462f82e.slice/crio-7bfdf794941d5229949faa1cd070d6835276600fe95a4a4a0dad15586a35e7d5 WatchSource:0}: Error finding container 7bfdf794941d5229949faa1cd070d6835276600fe95a4a4a0dad15586a35e7d5: Status 404 returned error can't find the container with id 7bfdf794941d5229949faa1cd070d6835276600fe95a4a4a0dad15586a35e7d5 Apr 22 21:19:46.341823 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:46.341780 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6c5c5db65c-k7mmd" event={"ID":"ab033c72-3077-4a4b-9d6c-06f84462f82e","Type":"ContainerStarted","Data":"7bfdf794941d5229949faa1cd070d6835276600fe95a4a4a0dad15586a35e7d5"} Apr 22 21:19:47.132807 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:47.132774 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="773599c7-7cba-42a7-ae31-13d90d3c998b" path="/var/lib/kubelet/pods/773599c7-7cba-42a7-ae31-13d90d3c998b/volumes" Apr 22 21:19:47.350442 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:47.350387 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6c5c5db65c-k7mmd" event={"ID":"ab033c72-3077-4a4b-9d6c-06f84462f82e","Type":"ContainerStarted","Data":"754f13f9c6b24e4e9406020b382b3ae239fa7eb7ffb48ff4c600249c89975c7f"} Apr 22 21:19:47.350862 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:47.350527 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-6c5c5db65c-k7mmd" Apr 22 21:19:47.364514 ip-10-0-134-137 
kubenswrapper[2566]: I0422 21:19:47.364463 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-6c5c5db65c-k7mmd" podStartSLOduration=1.232701149 podStartE2EDuration="2.364448644s" podCreationTimestamp="2026-04-22 21:19:45 +0000 UTC" firstStartedPulling="2026-04-22 21:19:46.098848041 +0000 UTC m=+643.455662774" lastFinishedPulling="2026-04-22 21:19:47.230595536 +0000 UTC m=+644.587410269" observedRunningTime="2026-04-22 21:19:47.363269226 +0000 UTC m=+644.720083978" watchObservedRunningTime="2026-04-22 21:19:47.364448644 +0000 UTC m=+644.721263395" Apr 22 21:19:53.359495 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:53.359467 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-6c5c5db65c-k7mmd" Apr 22 21:19:55.341162 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:55.341129 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-b44d57d8d-ghk72" Apr 22 21:19:55.341652 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:55.341182 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-5c6497bbdb-mpn66" Apr 22 21:19:55.390875 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:55.390840 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-b44d57d8d-ghk72"] Apr 22 21:19:55.391115 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:55.391065 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-b44d57d8d-ghk72" podUID="08d073ec-ea7e-4ce7-8a9d-663380c209c3" containerName="manager" containerID="cri-o://8687ee7536e5ee8df1a2a75b6d2dd71cbce281786e4d6264ae96197e09d695ea" gracePeriod=10 Apr 22 21:19:55.631820 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:55.630858 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-b44d57d8d-ghk72" Apr 22 21:19:55.673519 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:55.673478 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-9dd94f74c-8g925"] Apr 22 21:19:55.673908 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:55.673875 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="08d073ec-ea7e-4ce7-8a9d-663380c209c3" containerName="manager" Apr 22 21:19:55.673908 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:55.673899 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d073ec-ea7e-4ce7-8a9d-663380c209c3" containerName="manager" Apr 22 21:19:55.674064 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:55.674004 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="08d073ec-ea7e-4ce7-8a9d-663380c209c3" containerName="manager" Apr 22 21:19:55.677035 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:55.677017 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-9dd94f74c-8g925" Apr 22 21:19:55.683527 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:55.683461 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-9dd94f74c-8g925"] Apr 22 21:19:55.715176 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:55.715143 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tctvw\" (UniqueName: \"kubernetes.io/projected/08d073ec-ea7e-4ce7-8a9d-663380c209c3-kube-api-access-tctvw\") pod \"08d073ec-ea7e-4ce7-8a9d-663380c209c3\" (UID: \"08d073ec-ea7e-4ce7-8a9d-663380c209c3\") " Apr 22 21:19:55.715382 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:55.715363 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqbsp\" (UniqueName: \"kubernetes.io/projected/680b5706-4a69-4171-802e-f4ec36919d2b-kube-api-access-fqbsp\") pod \"maas-controller-9dd94f74c-8g925\" (UID: \"680b5706-4a69-4171-802e-f4ec36919d2b\") " pod="opendatahub/maas-controller-9dd94f74c-8g925" Apr 22 21:19:55.717201 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:55.717170 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08d073ec-ea7e-4ce7-8a9d-663380c209c3-kube-api-access-tctvw" (OuterVolumeSpecName: "kube-api-access-tctvw") pod "08d073ec-ea7e-4ce7-8a9d-663380c209c3" (UID: "08d073ec-ea7e-4ce7-8a9d-663380c209c3"). InnerVolumeSpecName "kube-api-access-tctvw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:19:55.816777 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:55.816741 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqbsp\" (UniqueName: \"kubernetes.io/projected/680b5706-4a69-4171-802e-f4ec36919d2b-kube-api-access-fqbsp\") pod \"maas-controller-9dd94f74c-8g925\" (UID: \"680b5706-4a69-4171-802e-f4ec36919d2b\") " pod="opendatahub/maas-controller-9dd94f74c-8g925" Apr 22 21:19:55.816993 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:55.816805 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tctvw\" (UniqueName: \"kubernetes.io/projected/08d073ec-ea7e-4ce7-8a9d-663380c209c3-kube-api-access-tctvw\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:19:55.824342 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:55.824318 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqbsp\" (UniqueName: \"kubernetes.io/projected/680b5706-4a69-4171-802e-f4ec36919d2b-kube-api-access-fqbsp\") pod \"maas-controller-9dd94f74c-8g925\" (UID: \"680b5706-4a69-4171-802e-f4ec36919d2b\") " pod="opendatahub/maas-controller-9dd94f74c-8g925" Apr 22 21:19:55.988662 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:55.988556 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-9dd94f74c-8g925" Apr 22 21:19:56.109205 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:56.109178 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-9dd94f74c-8g925"] Apr 22 21:19:56.111292 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:19:56.111263 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod680b5706_4a69_4171_802e_f4ec36919d2b.slice/crio-36a4b47bd0e4a1f383f49cabcad8a630e6e639d9c4f11ff6f17084210414794f WatchSource:0}: Error finding container 36a4b47bd0e4a1f383f49cabcad8a630e6e639d9c4f11ff6f17084210414794f: Status 404 returned error can't find the container with id 36a4b47bd0e4a1f383f49cabcad8a630e6e639d9c4f11ff6f17084210414794f Apr 22 21:19:56.383146 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:56.383117 2566 generic.go:358] "Generic (PLEG): container finished" podID="08d073ec-ea7e-4ce7-8a9d-663380c209c3" containerID="8687ee7536e5ee8df1a2a75b6d2dd71cbce281786e4d6264ae96197e09d695ea" exitCode=0 Apr 22 21:19:56.383611 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:56.383181 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-b44d57d8d-ghk72" Apr 22 21:19:56.383611 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:56.383207 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-b44d57d8d-ghk72" event={"ID":"08d073ec-ea7e-4ce7-8a9d-663380c209c3","Type":"ContainerDied","Data":"8687ee7536e5ee8df1a2a75b6d2dd71cbce281786e4d6264ae96197e09d695ea"} Apr 22 21:19:56.383611 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:56.383253 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-b44d57d8d-ghk72" event={"ID":"08d073ec-ea7e-4ce7-8a9d-663380c209c3","Type":"ContainerDied","Data":"b81fadab98341566b26ef844aa893fe888ba748b7628a6da60950a7857fb407e"} Apr 22 21:19:56.383611 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:56.383279 2566 scope.go:117] "RemoveContainer" containerID="8687ee7536e5ee8df1a2a75b6d2dd71cbce281786e4d6264ae96197e09d695ea" Apr 22 21:19:56.384939 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:56.384915 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-9dd94f74c-8g925" event={"ID":"680b5706-4a69-4171-802e-f4ec36919d2b","Type":"ContainerStarted","Data":"36a4b47bd0e4a1f383f49cabcad8a630e6e639d9c4f11ff6f17084210414794f"} Apr 22 21:19:56.391369 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:56.391254 2566 scope.go:117] "RemoveContainer" containerID="8687ee7536e5ee8df1a2a75b6d2dd71cbce281786e4d6264ae96197e09d695ea" Apr 22 21:19:56.391601 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:19:56.391574 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8687ee7536e5ee8df1a2a75b6d2dd71cbce281786e4d6264ae96197e09d695ea\": container with ID starting with 8687ee7536e5ee8df1a2a75b6d2dd71cbce281786e4d6264ae96197e09d695ea not found: ID does not exist" containerID="8687ee7536e5ee8df1a2a75b6d2dd71cbce281786e4d6264ae96197e09d695ea" Apr 22 21:19:56.391763 
ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:56.391732 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8687ee7536e5ee8df1a2a75b6d2dd71cbce281786e4d6264ae96197e09d695ea"} err="failed to get container status \"8687ee7536e5ee8df1a2a75b6d2dd71cbce281786e4d6264ae96197e09d695ea\": rpc error: code = NotFound desc = could not find container \"8687ee7536e5ee8df1a2a75b6d2dd71cbce281786e4d6264ae96197e09d695ea\": container with ID starting with 8687ee7536e5ee8df1a2a75b6d2dd71cbce281786e4d6264ae96197e09d695ea not found: ID does not exist" Apr 22 21:19:56.408162 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:56.408124 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-b44d57d8d-ghk72"] Apr 22 21:19:56.411450 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:56.411424 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-b44d57d8d-ghk72"] Apr 22 21:19:57.132810 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:57.132773 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08d073ec-ea7e-4ce7-8a9d-663380c209c3" path="/var/lib/kubelet/pods/08d073ec-ea7e-4ce7-8a9d-663380c209c3/volumes" Apr 22 21:19:57.390330 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:57.390241 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-9dd94f74c-8g925" event={"ID":"680b5706-4a69-4171-802e-f4ec36919d2b","Type":"ContainerStarted","Data":"89375dc823b6f31f3e98525a4696ec571901ae20b2089f5b93a6af95f0f3aad1"} Apr 22 21:19:57.390729 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:57.390391 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-9dd94f74c-8g925" Apr 22 21:19:57.405672 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:19:57.405623 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-9dd94f74c-8g925" 
podStartSLOduration=2.134443825 podStartE2EDuration="2.405604881s" podCreationTimestamp="2026-04-22 21:19:55 +0000 UTC" firstStartedPulling="2026-04-22 21:19:56.112649762 +0000 UTC m=+653.469464495" lastFinishedPulling="2026-04-22 21:19:56.383810804 +0000 UTC m=+653.740625551" observedRunningTime="2026-04-22 21:19:57.40337777 +0000 UTC m=+654.760192536" watchObservedRunningTime="2026-04-22 21:19:57.405604881 +0000 UTC m=+654.762419633" Apr 22 21:20:08.400915 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:08.400887 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-9dd94f74c-8g925" Apr 22 21:20:08.438498 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:08.438462 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5c6497bbdb-mpn66"] Apr 22 21:20:08.438734 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:08.438710 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-5c6497bbdb-mpn66" podUID="9df6dd0d-9819-42a7-a1b7-06ca5e59f67e" containerName="manager" containerID="cri-o://fc65d9102777541fdbfc31d33a47fba6835b47ef85e2840bcc7e9d00133365fe" gracePeriod=10 Apr 22 21:20:08.682185 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:08.682159 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5c6497bbdb-mpn66" Apr 22 21:20:08.821046 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:08.821007 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbw7d\" (UniqueName: \"kubernetes.io/projected/9df6dd0d-9819-42a7-a1b7-06ca5e59f67e-kube-api-access-xbw7d\") pod \"9df6dd0d-9819-42a7-a1b7-06ca5e59f67e\" (UID: \"9df6dd0d-9819-42a7-a1b7-06ca5e59f67e\") " Apr 22 21:20:08.823097 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:08.823062 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9df6dd0d-9819-42a7-a1b7-06ca5e59f67e-kube-api-access-xbw7d" (OuterVolumeSpecName: "kube-api-access-xbw7d") pod "9df6dd0d-9819-42a7-a1b7-06ca5e59f67e" (UID: "9df6dd0d-9819-42a7-a1b7-06ca5e59f67e"). InnerVolumeSpecName "kube-api-access-xbw7d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:20:08.922193 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:08.922102 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xbw7d\" (UniqueName: \"kubernetes.io/projected/9df6dd0d-9819-42a7-a1b7-06ca5e59f67e-kube-api-access-xbw7d\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:20:09.437158 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:09.437126 2566 generic.go:358] "Generic (PLEG): container finished" podID="9df6dd0d-9819-42a7-a1b7-06ca5e59f67e" containerID="fc65d9102777541fdbfc31d33a47fba6835b47ef85e2840bcc7e9d00133365fe" exitCode=0 Apr 22 21:20:09.437586 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:09.437174 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5c6497bbdb-mpn66" event={"ID":"9df6dd0d-9819-42a7-a1b7-06ca5e59f67e","Type":"ContainerDied","Data":"fc65d9102777541fdbfc31d33a47fba6835b47ef85e2840bcc7e9d00133365fe"} Apr 22 21:20:09.437586 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:09.437186 2566 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5c6497bbdb-mpn66" Apr 22 21:20:09.437586 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:09.437208 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5c6497bbdb-mpn66" event={"ID":"9df6dd0d-9819-42a7-a1b7-06ca5e59f67e","Type":"ContainerDied","Data":"669f0e6e416068e626710e32c2bc479a61a0dc9192651f02e9d936bccbc4a071"} Apr 22 21:20:09.437586 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:09.437231 2566 scope.go:117] "RemoveContainer" containerID="fc65d9102777541fdbfc31d33a47fba6835b47ef85e2840bcc7e9d00133365fe" Apr 22 21:20:09.445649 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:09.445437 2566 scope.go:117] "RemoveContainer" containerID="fc65d9102777541fdbfc31d33a47fba6835b47ef85e2840bcc7e9d00133365fe" Apr 22 21:20:09.445717 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:20:09.445693 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc65d9102777541fdbfc31d33a47fba6835b47ef85e2840bcc7e9d00133365fe\": container with ID starting with fc65d9102777541fdbfc31d33a47fba6835b47ef85e2840bcc7e9d00133365fe not found: ID does not exist" containerID="fc65d9102777541fdbfc31d33a47fba6835b47ef85e2840bcc7e9d00133365fe" Apr 22 21:20:09.445757 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:09.445725 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc65d9102777541fdbfc31d33a47fba6835b47ef85e2840bcc7e9d00133365fe"} err="failed to get container status \"fc65d9102777541fdbfc31d33a47fba6835b47ef85e2840bcc7e9d00133365fe\": rpc error: code = NotFound desc = could not find container \"fc65d9102777541fdbfc31d33a47fba6835b47ef85e2840bcc7e9d00133365fe\": container with ID starting with fc65d9102777541fdbfc31d33a47fba6835b47ef85e2840bcc7e9d00133365fe not found: ID does not exist" Apr 22 21:20:09.453115 ip-10-0-134-137 kubenswrapper[2566]: I0422 
21:20:09.453091 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5c6497bbdb-mpn66"] Apr 22 21:20:09.456141 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:09.456116 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-5c6497bbdb-mpn66"] Apr 22 21:20:11.132919 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:11.132876 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9df6dd0d-9819-42a7-a1b7-06ca5e59f67e" path="/var/lib/kubelet/pods/9df6dd0d-9819-42a7-a1b7-06ca5e59f67e/volumes" Apr 22 21:20:15.286535 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.286502 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5"] Apr 22 21:20:15.286916 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.286891 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9df6dd0d-9819-42a7-a1b7-06ca5e59f67e" containerName="manager" Apr 22 21:20:15.286916 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.286905 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="9df6dd0d-9819-42a7-a1b7-06ca5e59f67e" containerName="manager" Apr 22 21:20:15.287005 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.286960 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="9df6dd0d-9819-42a7-a1b7-06ca5e59f67e" containerName="manager" Apr 22 21:20:15.339483 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.339442 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78"] Apr 22 21:20:15.339686 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.339550 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:15.341830 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.341765 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 22 21:20:15.342016 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.341995 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 22 21:20:15.342145 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.342050 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-8wbgm\"" Apr 22 21:20:15.342202 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.342181 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 22 21:20:15.356563 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.356536 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5"] Apr 22 21:20:15.356563 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.356563 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78"] Apr 22 21:20:15.356730 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.356676 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:15.358506 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.358487 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 22 21:20:15.475080 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.475045 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78\" (UID: \"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:15.475080 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.475085 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5caa91a4-eb80-43c5-8902-f7ab4d118bc0-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5\" (UID: \"5caa91a4-eb80-43c5-8902-f7ab4d118bc0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:15.475309 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.475102 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5caa91a4-eb80-43c5-8902-f7ab4d118bc0-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5\" (UID: \"5caa91a4-eb80-43c5-8902-f7ab4d118bc0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:15.475309 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.475165 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5caa91a4-eb80-43c5-8902-f7ab4d118bc0-model-cache\") pod 
\"e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5\" (UID: \"5caa91a4-eb80-43c5-8902-f7ab4d118bc0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:15.475309 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.475213 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78\" (UID: \"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:15.475309 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.475240 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5caa91a4-eb80-43c5-8902-f7ab4d118bc0-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5\" (UID: \"5caa91a4-eb80-43c5-8902-f7ab4d118bc0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:15.475309 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.475272 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78\" (UID: \"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:15.475509 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.475338 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xddf\" (UniqueName: \"kubernetes.io/projected/56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd-kube-api-access-4xddf\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78\" (UID: \"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:15.475509 
ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.475381 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5caa91a4-eb80-43c5-8902-f7ab4d118bc0-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5\" (UID: \"5caa91a4-eb80-43c5-8902-f7ab4d118bc0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:15.475509 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.475423 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78\" (UID: \"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:15.475509 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.475464 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78\" (UID: \"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:15.475509 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.475493 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2jpm\" (UniqueName: \"kubernetes.io/projected/5caa91a4-eb80-43c5-8902-f7ab4d118bc0-kube-api-access-v2jpm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5\" (UID: \"5caa91a4-eb80-43c5-8902-f7ab4d118bc0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:15.576280 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.576244 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" 
(UniqueName: \"kubernetes.io/empty-dir/5caa91a4-eb80-43c5-8902-f7ab4d118bc0-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5\" (UID: \"5caa91a4-eb80-43c5-8902-f7ab4d118bc0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:15.576280 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.576280 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5caa91a4-eb80-43c5-8902-f7ab4d118bc0-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5\" (UID: \"5caa91a4-eb80-43c5-8902-f7ab4d118bc0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:15.576534 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.576309 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5caa91a4-eb80-43c5-8902-f7ab4d118bc0-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5\" (UID: \"5caa91a4-eb80-43c5-8902-f7ab4d118bc0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:15.576534 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.576331 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78\" (UID: \"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:15.576534 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.576472 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5caa91a4-eb80-43c5-8902-f7ab4d118bc0-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5\" (UID: \"5caa91a4-eb80-43c5-8902-f7ab4d118bc0\") " 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:15.576534 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.576515 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78\" (UID: \"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:15.576749 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.576559 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4xddf\" (UniqueName: \"kubernetes.io/projected/56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd-kube-api-access-4xddf\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78\" (UID: \"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:15.576749 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.576593 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5caa91a4-eb80-43c5-8902-f7ab4d118bc0-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5\" (UID: \"5caa91a4-eb80-43c5-8902-f7ab4d118bc0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:15.576749 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.576622 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78\" (UID: \"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:15.576749 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.576729 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" 
(UniqueName: \"kubernetes.io/empty-dir/56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78\" (UID: \"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:15.576749 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.576732 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78\" (UID: \"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:15.577033 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.576732 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5caa91a4-eb80-43c5-8902-f7ab4d118bc0-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5\" (UID: \"5caa91a4-eb80-43c5-8902-f7ab4d118bc0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:15.577033 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.576813 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2jpm\" (UniqueName: \"kubernetes.io/projected/5caa91a4-eb80-43c5-8902-f7ab4d118bc0-kube-api-access-v2jpm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5\" (UID: \"5caa91a4-eb80-43c5-8902-f7ab4d118bc0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:15.577033 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.576969 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78\" (UID: \"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd\") " 
pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:15.577033 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.576987 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5caa91a4-eb80-43c5-8902-f7ab4d118bc0-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5\" (UID: \"5caa91a4-eb80-43c5-8902-f7ab4d118bc0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:15.577226 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.577071 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5caa91a4-eb80-43c5-8902-f7ab4d118bc0-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5\" (UID: \"5caa91a4-eb80-43c5-8902-f7ab4d118bc0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:15.577226 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.577155 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78\" (UID: \"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:15.577364 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.577327 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78\" (UID: \"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:15.579156 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.579134 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78\" (UID: \"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:15.579443 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.579397 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5caa91a4-eb80-43c5-8902-f7ab4d118bc0-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5\" (UID: \"5caa91a4-eb80-43c5-8902-f7ab4d118bc0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:15.579650 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.579627 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5caa91a4-eb80-43c5-8902-f7ab4d118bc0-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5\" (UID: \"5caa91a4-eb80-43c5-8902-f7ab4d118bc0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:15.579693 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.579663 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78\" (UID: \"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:15.583635 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.583612 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2jpm\" (UniqueName: \"kubernetes.io/projected/5caa91a4-eb80-43c5-8902-f7ab4d118bc0-kube-api-access-v2jpm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5\" (UID: \"5caa91a4-eb80-43c5-8902-f7ab4d118bc0\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:15.583799 ip-10-0-134-137 kubenswrapper[2566]: I0422 
21:20:15.583779 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xddf\" (UniqueName: \"kubernetes.io/projected/56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd-kube-api-access-4xddf\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78\" (UID: \"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:15.651159 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.650821 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:15.665821 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.665793 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:15.793936 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.793904 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5"] Apr 22 21:20:15.797273 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:20:15.797239 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5caa91a4_eb80_43c5_8902_f7ab4d118bc0.slice/crio-35f1b8129301eb6ac2744bfcbaff4dd0fbbedb469248ab5692fff596aa734fda WatchSource:0}: Error finding container 35f1b8129301eb6ac2744bfcbaff4dd0fbbedb469248ab5692fff596aa734fda: Status 404 returned error can't find the container with id 35f1b8129301eb6ac2744bfcbaff4dd0fbbedb469248ab5692fff596aa734fda Apr 22 21:20:15.812140 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:15.812090 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78"] Apr 22 21:20:15.813981 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:20:15.813952 2566 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56da6d8f_0bf7_4c9a_9ebf_3424bcccedbd.slice/crio-1bda5d0331183318ee8d0e2868757d4068bb956cfd8760ee635e5df925b87425 WatchSource:0}: Error finding container 1bda5d0331183318ee8d0e2868757d4068bb956cfd8760ee635e5df925b87425: Status 404 returned error can't find the container with id 1bda5d0331183318ee8d0e2868757d4068bb956cfd8760ee635e5df925b87425 Apr 22 21:20:16.467480 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:16.467445 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" event={"ID":"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd","Type":"ContainerStarted","Data":"1bda5d0331183318ee8d0e2868757d4068bb956cfd8760ee635e5df925b87425"} Apr 22 21:20:16.468688 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:16.468654 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" event={"ID":"5caa91a4-eb80-43c5-8902-f7ab4d118bc0","Type":"ContainerStarted","Data":"35f1b8129301eb6ac2744bfcbaff4dd0fbbedb469248ab5692fff596aa734fda"} Apr 22 21:20:22.498075 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:22.498033 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" event={"ID":"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd","Type":"ContainerStarted","Data":"c2afa66c3ac679f8f1b845cca0cec2af2c8c5396b4bda56cd42ef8edf9ed3f26"} Apr 22 21:20:22.499689 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:22.499665 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" event={"ID":"5caa91a4-eb80-43c5-8902-f7ab4d118bc0","Type":"ContainerStarted","Data":"51315bc9c5327094eca1e24f8f5b75d7202a875f5a2a8bee4cbdde25db194f94"} Apr 22 21:20:28.522605 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:28.522574 2566 generic.go:358] "Generic (PLEG): container finished" podID="56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd" 
containerID="c2afa66c3ac679f8f1b845cca0cec2af2c8c5396b4bda56cd42ef8edf9ed3f26" exitCode=0 Apr 22 21:20:28.523034 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:28.522643 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" event={"ID":"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd","Type":"ContainerDied","Data":"c2afa66c3ac679f8f1b845cca0cec2af2c8c5396b4bda56cd42ef8edf9ed3f26"} Apr 22 21:20:28.523780 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:28.523753 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 21:20:28.529565 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:28.529542 2566 generic.go:358] "Generic (PLEG): container finished" podID="5caa91a4-eb80-43c5-8902-f7ab4d118bc0" containerID="51315bc9c5327094eca1e24f8f5b75d7202a875f5a2a8bee4cbdde25db194f94" exitCode=0 Apr 22 21:20:28.529669 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:28.529616 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" event={"ID":"5caa91a4-eb80-43c5-8902-f7ab4d118bc0","Type":"ContainerDied","Data":"51315bc9c5327094eca1e24f8f5b75d7202a875f5a2a8bee4cbdde25db194f94"} Apr 22 21:20:30.540170 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:30.540130 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" event={"ID":"56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd","Type":"ContainerStarted","Data":"f9a01277cde23b8ebf38559d2e75f9bd38d3ca40e3d7b24396ad81bfb02e60b6"} Apr 22 21:20:30.540992 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:30.540970 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:30.542953 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:30.542922 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" 
event={"ID":"5caa91a4-eb80-43c5-8902-f7ab4d118bc0","Type":"ContainerStarted","Data":"a96e95f876740c0a5e7b35d994971a64b3ab5e0f3ec6b3feed9ad2b2b6b928d8"} Apr 22 21:20:30.543174 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:30.543154 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:30.557983 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:30.557931 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" podStartSLOduration=1.538169007 podStartE2EDuration="15.557914566s" podCreationTimestamp="2026-04-22 21:20:15 +0000 UTC" firstStartedPulling="2026-04-22 21:20:15.815717771 +0000 UTC m=+673.172532501" lastFinishedPulling="2026-04-22 21:20:29.835463317 +0000 UTC m=+687.192278060" observedRunningTime="2026-04-22 21:20:30.555558455 +0000 UTC m=+687.912373220" watchObservedRunningTime="2026-04-22 21:20:30.557914566 +0000 UTC m=+687.914729416" Apr 22 21:20:30.572656 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:30.572612 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" podStartSLOduration=1.540766961 podStartE2EDuration="15.572599463s" podCreationTimestamp="2026-04-22 21:20:15 +0000 UTC" firstStartedPulling="2026-04-22 21:20:15.799033197 +0000 UTC m=+673.155847930" lastFinishedPulling="2026-04-22 21:20:29.830865697 +0000 UTC m=+687.187680432" observedRunningTime="2026-04-22 21:20:30.569624471 +0000 UTC m=+687.926439236" watchObservedRunningTime="2026-04-22 21:20:30.572599463 +0000 UTC m=+687.929414215" Apr 22 21:20:38.330943 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:38.330901 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-6c5c5db65c-k7mmd"] Apr 22 21:20:38.331430 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:38.331193 2566 kuberuntime_container.go:864] "Killing container 
with a grace period" pod="opendatahub/maas-api-6c5c5db65c-k7mmd" podUID="ab033c72-3077-4a4b-9d6c-06f84462f82e" containerName="maas-api" containerID="cri-o://754f13f9c6b24e4e9406020b382b3ae239fa7eb7ffb48ff4c600249c89975c7f" gracePeriod=30 Apr 22 21:20:38.354923 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:38.354887 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="opendatahub/maas-api-6c5c5db65c-k7mmd" podUID="ab033c72-3077-4a4b-9d6c-06f84462f82e" containerName="maas-api" probeResult="failure" output="Get \"https://10.132.0.37:8443/health\": dial tcp 10.132.0.37:8443: connect: connection refused" Apr 22 21:20:38.571652 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:38.571626 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6c5c5db65c-k7mmd" Apr 22 21:20:38.573932 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:38.573907 2566 generic.go:358] "Generic (PLEG): container finished" podID="ab033c72-3077-4a4b-9d6c-06f84462f82e" containerID="754f13f9c6b24e4e9406020b382b3ae239fa7eb7ffb48ff4c600249c89975c7f" exitCode=0 Apr 22 21:20:38.574003 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:38.573946 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6c5c5db65c-k7mmd" event={"ID":"ab033c72-3077-4a4b-9d6c-06f84462f82e","Type":"ContainerDied","Data":"754f13f9c6b24e4e9406020b382b3ae239fa7eb7ffb48ff4c600249c89975c7f"} Apr 22 21:20:38.574003 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:38.573968 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-6c5c5db65c-k7mmd" Apr 22 21:20:38.574003 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:38.573985 2566 scope.go:117] "RemoveContainer" containerID="754f13f9c6b24e4e9406020b382b3ae239fa7eb7ffb48ff4c600249c89975c7f" Apr 22 21:20:38.574125 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:38.573973 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6c5c5db65c-k7mmd" event={"ID":"ab033c72-3077-4a4b-9d6c-06f84462f82e","Type":"ContainerDied","Data":"7bfdf794941d5229949faa1cd070d6835276600fe95a4a4a0dad15586a35e7d5"} Apr 22 21:20:38.582078 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:38.582054 2566 scope.go:117] "RemoveContainer" containerID="754f13f9c6b24e4e9406020b382b3ae239fa7eb7ffb48ff4c600249c89975c7f" Apr 22 21:20:38.582371 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:20:38.582349 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"754f13f9c6b24e4e9406020b382b3ae239fa7eb7ffb48ff4c600249c89975c7f\": container with ID starting with 754f13f9c6b24e4e9406020b382b3ae239fa7eb7ffb48ff4c600249c89975c7f not found: ID does not exist" containerID="754f13f9c6b24e4e9406020b382b3ae239fa7eb7ffb48ff4c600249c89975c7f" Apr 22 21:20:38.582496 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:38.582377 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"754f13f9c6b24e4e9406020b382b3ae239fa7eb7ffb48ff4c600249c89975c7f"} err="failed to get container status \"754f13f9c6b24e4e9406020b382b3ae239fa7eb7ffb48ff4c600249c89975c7f\": rpc error: code = NotFound desc = could not find container \"754f13f9c6b24e4e9406020b382b3ae239fa7eb7ffb48ff4c600249c89975c7f\": container with ID starting with 754f13f9c6b24e4e9406020b382b3ae239fa7eb7ffb48ff4c600249c89975c7f not found: ID does not exist" Apr 22 21:20:38.681634 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:38.681595 2566 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/ab033c72-3077-4a4b-9d6c-06f84462f82e-maas-api-tls\") pod \"ab033c72-3077-4a4b-9d6c-06f84462f82e\" (UID: \"ab033c72-3077-4a4b-9d6c-06f84462f82e\") " Apr 22 21:20:38.681820 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:38.681653 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhfxd\" (UniqueName: \"kubernetes.io/projected/ab033c72-3077-4a4b-9d6c-06f84462f82e-kube-api-access-qhfxd\") pod \"ab033c72-3077-4a4b-9d6c-06f84462f82e\" (UID: \"ab033c72-3077-4a4b-9d6c-06f84462f82e\") " Apr 22 21:20:38.683831 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:38.683796 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab033c72-3077-4a4b-9d6c-06f84462f82e-kube-api-access-qhfxd" (OuterVolumeSpecName: "kube-api-access-qhfxd") pod "ab033c72-3077-4a4b-9d6c-06f84462f82e" (UID: "ab033c72-3077-4a4b-9d6c-06f84462f82e"). InnerVolumeSpecName "kube-api-access-qhfxd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:20:38.683831 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:38.683814 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab033c72-3077-4a4b-9d6c-06f84462f82e-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "ab033c72-3077-4a4b-9d6c-06f84462f82e" (UID: "ab033c72-3077-4a4b-9d6c-06f84462f82e"). InnerVolumeSpecName "maas-api-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 21:20:38.782764 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:38.782725 2566 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/ab033c72-3077-4a4b-9d6c-06f84462f82e-maas-api-tls\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:20:38.782764 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:38.782757 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qhfxd\" (UniqueName: \"kubernetes.io/projected/ab033c72-3077-4a4b-9d6c-06f84462f82e-kube-api-access-qhfxd\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:20:38.901935 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:38.901897 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-6c5c5db65c-k7mmd"] Apr 22 21:20:38.905932 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:38.905906 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-6c5c5db65c-k7mmd"] Apr 22 21:20:39.133105 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:39.133027 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab033c72-3077-4a4b-9d6c-06f84462f82e" path="/var/lib/kubelet/pods/ab033c72-3077-4a4b-9d6c-06f84462f82e/volumes" Apr 22 21:20:41.560692 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:41.560657 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5" Apr 22 21:20:41.561508 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:41.561487 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78" Apr 22 21:20:45.580912 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.580872 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr"] Apr 22 21:20:45.581297 ip-10-0-134-137 
kubenswrapper[2566]: I0422 21:20:45.581189 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab033c72-3077-4a4b-9d6c-06f84462f82e" containerName="maas-api" Apr 22 21:20:45.581297 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.581200 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab033c72-3077-4a4b-9d6c-06f84462f82e" containerName="maas-api" Apr 22 21:20:45.581297 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.581258 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab033c72-3077-4a4b-9d6c-06f84462f82e" containerName="maas-api" Apr 22 21:20:45.585744 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.585721 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:20:45.587455 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.587434 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 22 21:20:45.591192 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.591163 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr"] Apr 22 21:20:45.642063 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.642020 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cceb139a-05b1-40ab-84b9-bace919e6517-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr\" (UID: \"cceb139a-05b1-40ab-84b9-bace919e6517\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:20:45.642265 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.642077 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/cceb139a-05b1-40ab-84b9-bace919e6517-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr\" (UID: \"cceb139a-05b1-40ab-84b9-bace919e6517\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:20:45.642265 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.642117 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cceb139a-05b1-40ab-84b9-bace919e6517-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr\" (UID: \"cceb139a-05b1-40ab-84b9-bace919e6517\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:20:45.642265 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.642157 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cceb139a-05b1-40ab-84b9-bace919e6517-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr\" (UID: \"cceb139a-05b1-40ab-84b9-bace919e6517\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:20:45.642434 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.642258 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw5lk\" (UniqueName: \"kubernetes.io/projected/cceb139a-05b1-40ab-84b9-bace919e6517-kube-api-access-xw5lk\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr\" (UID: \"cceb139a-05b1-40ab-84b9-bace919e6517\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:20:45.642434 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.642310 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/cceb139a-05b1-40ab-84b9-bace919e6517-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr\" (UID: \"cceb139a-05b1-40ab-84b9-bace919e6517\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:20:45.743218 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.743170 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cceb139a-05b1-40ab-84b9-bace919e6517-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr\" (UID: \"cceb139a-05b1-40ab-84b9-bace919e6517\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:20:45.743434 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.743235 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cceb139a-05b1-40ab-84b9-bace919e6517-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr\" (UID: \"cceb139a-05b1-40ab-84b9-bace919e6517\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:20:45.743434 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.743279 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cceb139a-05b1-40ab-84b9-bace919e6517-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr\" (UID: \"cceb139a-05b1-40ab-84b9-bace919e6517\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:20:45.743434 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.743323 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cceb139a-05b1-40ab-84b9-bace919e6517-model-cache\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr\" (UID: \"cceb139a-05b1-40ab-84b9-bace919e6517\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:20:45.743434 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.743373 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xw5lk\" (UniqueName: \"kubernetes.io/projected/cceb139a-05b1-40ab-84b9-bace919e6517-kube-api-access-xw5lk\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr\" (UID: \"cceb139a-05b1-40ab-84b9-bace919e6517\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:20:45.743434 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.743424 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cceb139a-05b1-40ab-84b9-bace919e6517-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr\" (UID: \"cceb139a-05b1-40ab-84b9-bace919e6517\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:20:45.744134 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.744081 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cceb139a-05b1-40ab-84b9-bace919e6517-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr\" (UID: \"cceb139a-05b1-40ab-84b9-bace919e6517\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:20:45.744134 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.744106 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cceb139a-05b1-40ab-84b9-bace919e6517-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr\" (UID: \"cceb139a-05b1-40ab-84b9-bace919e6517\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:20:45.744324 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.744141 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cceb139a-05b1-40ab-84b9-bace919e6517-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr\" (UID: \"cceb139a-05b1-40ab-84b9-bace919e6517\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:20:45.745722 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.745704 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cceb139a-05b1-40ab-84b9-bace919e6517-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr\" (UID: \"cceb139a-05b1-40ab-84b9-bace919e6517\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:20:45.746258 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.746235 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cceb139a-05b1-40ab-84b9-bace919e6517-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr\" (UID: \"cceb139a-05b1-40ab-84b9-bace919e6517\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:20:45.758446 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:45.758405 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw5lk\" (UniqueName: \"kubernetes.io/projected/cceb139a-05b1-40ab-84b9-bace919e6517-kube-api-access-xw5lk\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr\" (UID: \"cceb139a-05b1-40ab-84b9-bace919e6517\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:20:45.897172 ip-10-0-134-137 kubenswrapper[2566]: I0422 
21:20:45.897075 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:20:46.042305 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:46.042258 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr"] Apr 22 21:20:46.046226 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:20:46.046196 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcceb139a_05b1_40ab_84b9_bace919e6517.slice/crio-7b759791a78a0648bd4c78af2943b603b5931ca3a0706b93403f60b27deddcbf WatchSource:0}: Error finding container 7b759791a78a0648bd4c78af2943b603b5931ca3a0706b93403f60b27deddcbf: Status 404 returned error can't find the container with id 7b759791a78a0648bd4c78af2943b603b5931ca3a0706b93403f60b27deddcbf Apr 22 21:20:46.610595 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:46.610548 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" event={"ID":"cceb139a-05b1-40ab-84b9-bace919e6517","Type":"ContainerStarted","Data":"17f0b6b1593fd427c25cf9dccec14c843a14a9d3084f686c5d9269c3836b3107"} Apr 22 21:20:46.610595 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:46.610599 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" event={"ID":"cceb139a-05b1-40ab-84b9-bace919e6517","Type":"ContainerStarted","Data":"7b759791a78a0648bd4c78af2943b603b5931ca3a0706b93403f60b27deddcbf"} Apr 22 21:20:52.636704 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:52.636666 2566 generic.go:358] "Generic (PLEG): container finished" podID="cceb139a-05b1-40ab-84b9-bace919e6517" containerID="17f0b6b1593fd427c25cf9dccec14c843a14a9d3084f686c5d9269c3836b3107" exitCode=0 Apr 22 21:20:52.637286 ip-10-0-134-137 kubenswrapper[2566]: 
I0422 21:20:52.636715 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" event={"ID":"cceb139a-05b1-40ab-84b9-bace919e6517","Type":"ContainerDied","Data":"17f0b6b1593fd427c25cf9dccec14c843a14a9d3084f686c5d9269c3836b3107"} Apr 22 21:20:53.643398 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:53.643358 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" event={"ID":"cceb139a-05b1-40ab-84b9-bace919e6517","Type":"ContainerStarted","Data":"6718b55c397a93c4ff33fa04ef7a65dc587b4b195d293654e3a28cf70cbeeb0f"} Apr 22 21:20:53.643974 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:53.643617 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:20:53.659523 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:20:53.659466 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" podStartSLOduration=8.421369483 podStartE2EDuration="8.659450691s" podCreationTimestamp="2026-04-22 21:20:45 +0000 UTC" firstStartedPulling="2026-04-22 21:20:52.638064966 +0000 UTC m=+709.994879707" lastFinishedPulling="2026-04-22 21:20:52.876146185 +0000 UTC m=+710.232960915" observedRunningTime="2026-04-22 21:20:53.658114319 +0000 UTC m=+711.014929070" watchObservedRunningTime="2026-04-22 21:20:53.659450691 +0000 UTC m=+711.016265442" Apr 22 21:21:04.662819 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:21:04.662787 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr" Apr 22 21:23:08.865078 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:08.865038 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-9dd94f74c-8g925"] Apr 22 
21:23:08.865534 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:08.865273 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-9dd94f74c-8g925" podUID="680b5706-4a69-4171-802e-f4ec36919d2b" containerName="manager" containerID="cri-o://89375dc823b6f31f3e98525a4696ec571901ae20b2089f5b93a6af95f0f3aad1" gracePeriod=10 Apr 22 21:23:09.105516 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:09.105484 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-9dd94f74c-8g925" Apr 22 21:23:09.155085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:09.154995 2566 generic.go:358] "Generic (PLEG): container finished" podID="680b5706-4a69-4171-802e-f4ec36919d2b" containerID="89375dc823b6f31f3e98525a4696ec571901ae20b2089f5b93a6af95f0f3aad1" exitCode=0 Apr 22 21:23:09.155085 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:09.155068 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-9dd94f74c-8g925" event={"ID":"680b5706-4a69-4171-802e-f4ec36919d2b","Type":"ContainerDied","Data":"89375dc823b6f31f3e98525a4696ec571901ae20b2089f5b93a6af95f0f3aad1"} Apr 22 21:23:09.155254 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:09.155105 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-9dd94f74c-8g925" event={"ID":"680b5706-4a69-4171-802e-f4ec36919d2b","Type":"ContainerDied","Data":"36a4b47bd0e4a1f383f49cabcad8a630e6e639d9c4f11ff6f17084210414794f"} Apr 22 21:23:09.155254 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:09.155113 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-9dd94f74c-8g925" Apr 22 21:23:09.155254 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:09.155125 2566 scope.go:117] "RemoveContainer" containerID="89375dc823b6f31f3e98525a4696ec571901ae20b2089f5b93a6af95f0f3aad1" Apr 22 21:23:09.165756 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:09.165701 2566 scope.go:117] "RemoveContainer" containerID="89375dc823b6f31f3e98525a4696ec571901ae20b2089f5b93a6af95f0f3aad1" Apr 22 21:23:09.166315 ip-10-0-134-137 kubenswrapper[2566]: E0422 21:23:09.166288 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89375dc823b6f31f3e98525a4696ec571901ae20b2089f5b93a6af95f0f3aad1\": container with ID starting with 89375dc823b6f31f3e98525a4696ec571901ae20b2089f5b93a6af95f0f3aad1 not found: ID does not exist" containerID="89375dc823b6f31f3e98525a4696ec571901ae20b2089f5b93a6af95f0f3aad1" Apr 22 21:23:09.166435 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:09.166325 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89375dc823b6f31f3e98525a4696ec571901ae20b2089f5b93a6af95f0f3aad1"} err="failed to get container status \"89375dc823b6f31f3e98525a4696ec571901ae20b2089f5b93a6af95f0f3aad1\": rpc error: code = NotFound desc = could not find container \"89375dc823b6f31f3e98525a4696ec571901ae20b2089f5b93a6af95f0f3aad1\": container with ID starting with 89375dc823b6f31f3e98525a4696ec571901ae20b2089f5b93a6af95f0f3aad1 not found: ID does not exist" Apr 22 21:23:09.255250 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:09.255226 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqbsp\" (UniqueName: \"kubernetes.io/projected/680b5706-4a69-4171-802e-f4ec36919d2b-kube-api-access-fqbsp\") pod \"680b5706-4a69-4171-802e-f4ec36919d2b\" (UID: \"680b5706-4a69-4171-802e-f4ec36919d2b\") " Apr 22 21:23:09.257220 ip-10-0-134-137 
kubenswrapper[2566]: I0422 21:23:09.257194 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/680b5706-4a69-4171-802e-f4ec36919d2b-kube-api-access-fqbsp" (OuterVolumeSpecName: "kube-api-access-fqbsp") pod "680b5706-4a69-4171-802e-f4ec36919d2b" (UID: "680b5706-4a69-4171-802e-f4ec36919d2b"). InnerVolumeSpecName "kube-api-access-fqbsp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 21:23:09.356816 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:09.356786 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fqbsp\" (UniqueName: \"kubernetes.io/projected/680b5706-4a69-4171-802e-f4ec36919d2b-kube-api-access-fqbsp\") on node \"ip-10-0-134-137.ec2.internal\" DevicePath \"\"" Apr 22 21:23:09.475322 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:09.475285 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-9dd94f74c-8g925"] Apr 22 21:23:09.477283 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:09.477262 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-9dd94f74c-8g925"] Apr 22 21:23:10.145850 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:10.145814 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-9dd94f74c-nx6bw"] Apr 22 21:23:10.146524 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:10.146500 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="680b5706-4a69-4171-802e-f4ec36919d2b" containerName="manager" Apr 22 21:23:10.146614 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:10.146527 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="680b5706-4a69-4171-802e-f4ec36919d2b" containerName="manager" Apr 22 21:23:10.146614 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:10.146605 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="680b5706-4a69-4171-802e-f4ec36919d2b" containerName="manager" Apr 22 21:23:10.150723 
ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:10.150693 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-9dd94f74c-nx6bw" Apr 22 21:23:10.152642 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:10.152616 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-wvptf\"" Apr 22 21:23:10.158685 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:10.158664 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-9dd94f74c-nx6bw"] Apr 22 21:23:10.262366 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:10.262327 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcss9\" (UniqueName: \"kubernetes.io/projected/6d698428-b4e4-41dd-985f-57494492421e-kube-api-access-jcss9\") pod \"maas-controller-9dd94f74c-nx6bw\" (UID: \"6d698428-b4e4-41dd-985f-57494492421e\") " pod="opendatahub/maas-controller-9dd94f74c-nx6bw" Apr 22 21:23:10.363691 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:10.363653 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcss9\" (UniqueName: \"kubernetes.io/projected/6d698428-b4e4-41dd-985f-57494492421e-kube-api-access-jcss9\") pod \"maas-controller-9dd94f74c-nx6bw\" (UID: \"6d698428-b4e4-41dd-985f-57494492421e\") " pod="opendatahub/maas-controller-9dd94f74c-nx6bw" Apr 22 21:23:10.370867 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:10.370837 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcss9\" (UniqueName: \"kubernetes.io/projected/6d698428-b4e4-41dd-985f-57494492421e-kube-api-access-jcss9\") pod \"maas-controller-9dd94f74c-nx6bw\" (UID: \"6d698428-b4e4-41dd-985f-57494492421e\") " pod="opendatahub/maas-controller-9dd94f74c-nx6bw" Apr 22 21:23:10.463111 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:10.463014 2566 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-9dd94f74c-nx6bw" Apr 22 21:23:10.583125 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:10.583099 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-9dd94f74c-nx6bw"] Apr 22 21:23:10.584860 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:23:10.584820 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d698428_b4e4_41dd_985f_57494492421e.slice/crio-c2b092081f4d297a9055212d53170c4713a0586c987439d470b66495e80ceebe WatchSource:0}: Error finding container c2b092081f4d297a9055212d53170c4713a0586c987439d470b66495e80ceebe: Status 404 returned error can't find the container with id c2b092081f4d297a9055212d53170c4713a0586c987439d470b66495e80ceebe Apr 22 21:23:11.133808 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:11.133775 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="680b5706-4a69-4171-802e-f4ec36919d2b" path="/var/lib/kubelet/pods/680b5706-4a69-4171-802e-f4ec36919d2b/volumes" Apr 22 21:23:11.165737 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:11.165705 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-9dd94f74c-nx6bw" event={"ID":"6d698428-b4e4-41dd-985f-57494492421e","Type":"ContainerStarted","Data":"7b268f6fb0e7bc466c3f7e24011874d0131b807122c7d3b27ac69ccb8cc2274d"} Apr 22 21:23:11.165737 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:11.165739 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-9dd94f74c-nx6bw" event={"ID":"6d698428-b4e4-41dd-985f-57494492421e","Type":"ContainerStarted","Data":"c2b092081f4d297a9055212d53170c4713a0586c987439d470b66495e80ceebe"} Apr 22 21:23:11.166119 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:11.165763 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-9dd94f74c-nx6bw" 
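The entries above all share the same framing: a journald prefix (timestamp, host, `kubenswrapper[PID]:`) followed by a klog header (severity letter `I`/`W`/`E`/`F`, `MMDD`, wall time, PID, `file:line]`) and the structured message. A minimal parser sketch for pulling these apart — the field labels (`host`, `unit`, `severity`, …) are our own names for the captures, not anything kubelet defines:

```python
import re
from typing import Optional

# Journald prefix ("Apr 22 21:23:10.150723 ip-10-0-134-137 kubenswrapper[2566]: ")
# followed by a klog header ("I0422 21:23:10.150693 2566 util.go:30] ...").
LINE_RE = re.compile(
    r'^(?P<syslog_ts>\w{3} \d{2} [\d:.]+) '
    r'(?P<host>\S+) '
    r'(?P<unit>\w+)\[(?P<unit_pid>\d+)\]: '
    r'(?P<severity>[IWEF])(?P<mmdd>\d{4}) (?P<time>[\d:.]+)\s+'
    r'(?P<pid>\d+) (?P<source>[\w.]+:\d+)\] (?P<msg>.*)$'
)

def parse_kubelet_line(line: str) -> Optional[dict]:
    """Split one journald-wrapped klog entry into labeled fields; None if it doesn't match."""
    m = LINE_RE.match(line)
    return m.groupdict() if m else None
```

For example, feeding it the "No sandbox for pod can be found" entry yields `severity == "I"`, `source == "util.go:30"`, and the quoted message plus its `pod=` key/value pairs in `msg`.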
Apr 22 21:23:11.178462 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:11.178394 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-9dd94f74c-nx6bw" podStartSLOduration=0.761772713 podStartE2EDuration="1.178379897s" podCreationTimestamp="2026-04-22 21:23:10 +0000 UTC" firstStartedPulling="2026-04-22 21:23:10.586099512 +0000 UTC m=+847.942914247" lastFinishedPulling="2026-04-22 21:23:11.002706701 +0000 UTC m=+848.359521431" observedRunningTime="2026-04-22 21:23:11.178206903 +0000 UTC m=+848.535021657" watchObservedRunningTime="2026-04-22 21:23:11.178379897 +0000 UTC m=+848.535194648" Apr 22 21:23:22.175333 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:23:22.175298 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-9dd94f74c-nx6bw" Apr 22 21:24:03.066840 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:24:03.066763 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovn-acl-logging/0.log" Apr 22 21:24:03.070094 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:24:03.070072 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovn-acl-logging/0.log" Apr 22 21:29:03.096267 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:29:03.096241 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovn-acl-logging/0.log" Apr 22 21:29:03.101557 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:29:03.101535 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovn-acl-logging/0.log" Apr 22 21:30:58.895182 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:30:58.895144 2566 generic.go:358] "Generic (PLEG): container finished" 
podID="591364e1-7e0f-4f15-8ee7-af576f419ab0" containerID="37ca1c3f69e50ff7ffbeb744d4e686c8998c367a2fabf18b6da01be6763831c7" exitCode=1 Apr 22 21:30:58.895617 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:30:58.895211 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-stmtk" event={"ID":"591364e1-7e0f-4f15-8ee7-af576f419ab0","Type":"ContainerDied","Data":"37ca1c3f69e50ff7ffbeb744d4e686c8998c367a2fabf18b6da01be6763831c7"} Apr 22 21:30:58.895617 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:30:58.895539 2566 scope.go:117] "RemoveContainer" containerID="37ca1c3f69e50ff7ffbeb744d4e686c8998c367a2fabf18b6da01be6763831c7" Apr 22 21:30:58.895857 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:30:58.895843 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 21:30:59.900127 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:30:59.900090 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-stmtk" event={"ID":"591364e1-7e0f-4f15-8ee7-af576f419ab0","Type":"ContainerStarted","Data":"f262e6f3ccb7fffe64fa6588b718f4c98d7bc9fd8ccb0cf190fbca0053720b8c"} Apr 22 21:30:59.900539 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:30:59.900299 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-stmtk" Apr 22 21:31:30.909055 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:31:30.908972 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-stmtk" Apr 22 21:34:03.125702 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:34:03.125671 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovn-acl-logging/0.log" Apr 22 21:34:03.135739 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:34:03.135717 2566 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovn-acl-logging/0.log" Apr 22 21:39:03.164404 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:39:03.164326 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovn-acl-logging/0.log" Apr 22 21:39:03.172487 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:39:03.172466 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovn-acl-logging/0.log" Apr 22 21:44:03.195160 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:03.195127 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovn-acl-logging/0.log" Apr 22 21:44:03.204108 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:03.204083 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovn-acl-logging/0.log" Apr 22 21:44:06.007291 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:06.007253 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-stmtk_591364e1-7e0f-4f15-8ee7-af576f419ab0/manager/1.log" Apr 22 21:44:06.238768 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:06.238734 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-9dd94f74c-nx6bw_6d698428-b4e4-41dd-985f-57494492421e/manager/0.log" Apr 22 21:44:06.356711 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:06.356673 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-rbgsq_75d67504-5aa5-459b-8ec5-a68199bd0be6/manager/0.log" Apr 22 21:44:06.466232 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:06.466203 2566 log.go:25] 
"Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-65d8664856-2kn6j_1d3599da-edfb-4118-bf7e-9534da06b88b/manager/0.log" Apr 22 21:44:06.816402 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:06.816371 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-vrg5c_ee48ec0c-cc94-4b7f-9f3a-f4f896fe6025/postgres/0.log" Apr 22 21:44:07.568900 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:07.568864 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v_bab93fad-8ef3-4d6c-baf4-8a9a2431581f/pull/0.log" Apr 22 21:44:07.574549 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:07.574508 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v_bab93fad-8ef3-4d6c-baf4-8a9a2431581f/extract/0.log" Apr 22 21:44:07.579892 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:07.579869 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v_bab93fad-8ef3-4d6c-baf4-8a9a2431581f/util/0.log" Apr 22 21:44:07.688473 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:07.688438 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5_bc060510-4b85-48e8-8bb0-4a4d470d8cc2/extract/0.log" Apr 22 21:44:07.693759 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:07.693737 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5_bc060510-4b85-48e8-8bb0-4a4d470d8cc2/util/0.log" Apr 22 21:44:07.699465 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:07.699441 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5_bc060510-4b85-48e8-8bb0-4a4d470d8cc2/pull/0.log" Apr 22 21:44:07.805578 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:07.805547 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw_221ac19a-80b1-431f-9b74-31be7bb3ab98/util/0.log" Apr 22 21:44:07.811078 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:07.811029 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw_221ac19a-80b1-431f-9b74-31be7bb3ab98/pull/0.log" Apr 22 21:44:07.816300 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:07.816271 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw_221ac19a-80b1-431f-9b74-31be7bb3ab98/extract/0.log" Apr 22 21:44:07.922914 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:07.922828 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d_f85a7385-6408-446c-9911-6446129baf21/extract/0.log" Apr 22 21:44:07.927687 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:07.927665 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d_f85a7385-6408-446c-9911-6446129baf21/util/0.log" Apr 22 21:44:07.932333 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:07.932312 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d_f85a7385-6408-446c-9911-6446129baf21/pull/0.log" Apr 22 21:44:08.161997 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:08.161958 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-msf8d_7eef0b82-ed6e-4984-ab19-0c4b11aa7579/manager/0.log" Apr 22 21:44:09.281782 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:09.281748 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-rldff_16586da0-c3ba-4397-8b50-5817ee975d70/discovery/0.log" Apr 22 21:44:09.492470 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:09.492404 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7b8c5f7f67-hvd97_d3c57abc-8177-4236-b215-e1fcf1fa366c/kube-auth-proxy/0.log" Apr 22 21:44:10.047060 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:10.047032 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5_5caa91a4-eb80-43c5-8902-f7ab4d118bc0/storage-initializer/0.log" Apr 22 21:44:10.054014 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:10.053983 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-hcpt5_5caa91a4-eb80-43c5-8902-f7ab4d118bc0/main/0.log" Apr 22 21:44:10.157235 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:10.157197 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78_56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd/storage-initializer/0.log" Apr 22 21:44:10.164454 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:10.164424 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-7pq78_56da6d8f-0bf7-4c9a-9ebf-3424bcccedbd/main/0.log" Apr 22 21:44:10.379521 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:10.379486 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr_cceb139a-05b1-40ab-84b9-bace919e6517/storage-initializer/0.log" Apr 22 21:44:10.386428 ip-10-0-134-137 kubenswrapper[2566]: 
I0422 21:44:10.386380 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccrx9qr_cceb139a-05b1-40ab-84b9-bace919e6517/main/0.log" Apr 22 21:44:17.217767 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:17.217726 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-gsqgw_94c92ddb-384e-4003-80b5-1b032afdc994/global-pull-secret-syncer/0.log" Apr 22 21:44:17.381883 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:17.381847 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-fcxqm_8c9bd2f4-6d8d-4016-b3c4-d2b398b5abc9/konnectivity-agent/0.log" Apr 22 21:44:17.401662 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:17.401633 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-137.ec2.internal_c6c453189c3f15e838fbba5937c30555/haproxy/0.log" Apr 22 21:44:21.141630 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:21.141592 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v_bab93fad-8ef3-4d6c-baf4-8a9a2431581f/extract/0.log" Apr 22 21:44:21.157720 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:21.157683 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v_bab93fad-8ef3-4d6c-baf4-8a9a2431581f/util/0.log" Apr 22 21:44:21.187305 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:21.187276 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7592sq7v_bab93fad-8ef3-4d6c-baf4-8a9a2431581f/pull/0.log" Apr 22 21:44:21.211599 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:21.211562 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5_bc060510-4b85-48e8-8bb0-4a4d470d8cc2/extract/0.log" Apr 22 21:44:21.228124 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:21.228096 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5_bc060510-4b85-48e8-8bb0-4a4d470d8cc2/util/0.log" Apr 22 21:44:21.245112 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:21.245078 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0rpnh5_bc060510-4b85-48e8-8bb0-4a4d470d8cc2/pull/0.log" Apr 22 21:44:21.270708 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:21.270682 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw_221ac19a-80b1-431f-9b74-31be7bb3ab98/extract/0.log" Apr 22 21:44:21.289528 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:21.289500 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw_221ac19a-80b1-431f-9b74-31be7bb3ab98/util/0.log" Apr 22 21:44:21.306046 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:21.305998 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed735xggw_221ac19a-80b1-431f-9b74-31be7bb3ab98/pull/0.log" Apr 22 21:44:21.329129 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:21.329098 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d_f85a7385-6408-446c-9911-6446129baf21/extract/0.log" Apr 22 21:44:21.345750 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:21.345724 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d_f85a7385-6408-446c-9911-6446129baf21/util/0.log" Apr 22 21:44:21.363220 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:21.363193 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1km28d_f85a7385-6408-446c-9911-6446129baf21/pull/0.log" Apr 22 21:44:21.650218 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:21.650183 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-msf8d_7eef0b82-ed6e-4984-ab19-0c4b11aa7579/manager/0.log" Apr 22 21:44:23.718912 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:23.718881 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tp9q7_5fa15de5-6550-43dc-afb0-2feb4e44de89/node-exporter/0.log" Apr 22 21:44:23.734347 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:23.734320 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tp9q7_5fa15de5-6550-43dc-afb0-2feb4e44de89/kube-rbac-proxy/0.log" Apr 22 21:44:23.750992 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:23.750962 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tp9q7_5fa15de5-6550-43dc-afb0-2feb4e44de89/init-textfile/0.log" Apr 22 21:44:26.101066 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.101020 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx"] Apr 22 21:44:26.104824 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.104799 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx" Apr 22 21:44:26.106701 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.106672 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-ddjjb\"/\"openshift-service-ca.crt\"" Apr 22 21:44:26.107351 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.107327 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-ddjjb\"/\"kube-root-ca.crt\"" Apr 22 21:44:26.107598 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.107580 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-ddjjb\"/\"default-dockercfg-kv5l8\"" Apr 22 21:44:26.113429 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.113380 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx"] Apr 22 21:44:26.228652 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.228619 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/69a1a95e-be16-4dd9-be4f-2c748de9976d-lib-modules\") pod \"perf-node-gather-daemonset-d6rsx\" (UID: \"69a1a95e-be16-4dd9-be4f-2c748de9976d\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx" Apr 22 21:44:26.228848 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.228665 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69a1a95e-be16-4dd9-be4f-2c748de9976d-sys\") pod \"perf-node-gather-daemonset-d6rsx\" (UID: \"69a1a95e-be16-4dd9-be4f-2c748de9976d\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx" Apr 22 21:44:26.228848 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.228701 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/69a1a95e-be16-4dd9-be4f-2c748de9976d-proc\") pod \"perf-node-gather-daemonset-d6rsx\" (UID: \"69a1a95e-be16-4dd9-be4f-2c748de9976d\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx" Apr 22 21:44:26.228848 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.228717 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md7sr\" (UniqueName: \"kubernetes.io/projected/69a1a95e-be16-4dd9-be4f-2c748de9976d-kube-api-access-md7sr\") pod \"perf-node-gather-daemonset-d6rsx\" (UID: \"69a1a95e-be16-4dd9-be4f-2c748de9976d\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx" Apr 22 21:44:26.228848 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.228746 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/69a1a95e-be16-4dd9-be4f-2c748de9976d-podres\") pod \"perf-node-gather-daemonset-d6rsx\" (UID: \"69a1a95e-be16-4dd9-be4f-2c748de9976d\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx" Apr 22 21:44:26.329579 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.329541 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/69a1a95e-be16-4dd9-be4f-2c748de9976d-podres\") pod \"perf-node-gather-daemonset-d6rsx\" (UID: \"69a1a95e-be16-4dd9-be4f-2c748de9976d\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx" Apr 22 21:44:26.329768 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.329622 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/69a1a95e-be16-4dd9-be4f-2c748de9976d-lib-modules\") pod \"perf-node-gather-daemonset-d6rsx\" (UID: 
\"69a1a95e-be16-4dd9-be4f-2c748de9976d\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx" Apr 22 21:44:26.329768 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.329653 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69a1a95e-be16-4dd9-be4f-2c748de9976d-sys\") pod \"perf-node-gather-daemonset-d6rsx\" (UID: \"69a1a95e-be16-4dd9-be4f-2c748de9976d\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx" Apr 22 21:44:26.329768 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.329669 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/69a1a95e-be16-4dd9-be4f-2c748de9976d-proc\") pod \"perf-node-gather-daemonset-d6rsx\" (UID: \"69a1a95e-be16-4dd9-be4f-2c748de9976d\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx" Apr 22 21:44:26.329768 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.329687 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-md7sr\" (UniqueName: \"kubernetes.io/projected/69a1a95e-be16-4dd9-be4f-2c748de9976d-kube-api-access-md7sr\") pod \"perf-node-gather-daemonset-d6rsx\" (UID: \"69a1a95e-be16-4dd9-be4f-2c748de9976d\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx" Apr 22 21:44:26.329768 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.329723 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/69a1a95e-be16-4dd9-be4f-2c748de9976d-podres\") pod \"perf-node-gather-daemonset-d6rsx\" (UID: \"69a1a95e-be16-4dd9-be4f-2c748de9976d\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx" Apr 22 21:44:26.329933 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.329766 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/69a1a95e-be16-4dd9-be4f-2c748de9976d-sys\") pod \"perf-node-gather-daemonset-d6rsx\" (UID: \"69a1a95e-be16-4dd9-be4f-2c748de9976d\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx" Apr 22 21:44:26.329933 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.329785 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/69a1a95e-be16-4dd9-be4f-2c748de9976d-proc\") pod \"perf-node-gather-daemonset-d6rsx\" (UID: \"69a1a95e-be16-4dd9-be4f-2c748de9976d\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx" Apr 22 21:44:26.329933 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.329793 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/69a1a95e-be16-4dd9-be4f-2c748de9976d-lib-modules\") pod \"perf-node-gather-daemonset-d6rsx\" (UID: \"69a1a95e-be16-4dd9-be4f-2c748de9976d\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx" Apr 22 21:44:26.337363 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.337328 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-md7sr\" (UniqueName: \"kubernetes.io/projected/69a1a95e-be16-4dd9-be4f-2c748de9976d-kube-api-access-md7sr\") pod \"perf-node-gather-daemonset-d6rsx\" (UID: \"69a1a95e-be16-4dd9-be4f-2c748de9976d\") " pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx" Apr 22 21:44:26.415562 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.415472 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx" Apr 22 21:44:26.545231 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.545164 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx"] Apr 22 21:44:26.547779 ip-10-0-134-137 kubenswrapper[2566]: W0422 21:44:26.547749 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod69a1a95e_be16_4dd9_be4f_2c748de9976d.slice/crio-eff1f70fe22c7586f1400c2be0ab3d7fbc4ad06460446410190d3faad92db904 WatchSource:0}: Error finding container eff1f70fe22c7586f1400c2be0ab3d7fbc4ad06460446410190d3faad92db904: Status 404 returned error can't find the container with id eff1f70fe22c7586f1400c2be0ab3d7fbc4ad06460446410190d3faad92db904 Apr 22 21:44:26.549542 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.549521 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 21:44:26.857790 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.857754 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx" event={"ID":"69a1a95e-be16-4dd9-be4f-2c748de9976d","Type":"ContainerStarted","Data":"3a7b804f1aef5f832c98fcbf462529eea5f69700b27e985b07516d60626429a0"} Apr 22 21:44:26.857790 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.857790 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx" event={"ID":"69a1a95e-be16-4dd9-be4f-2c748de9976d","Type":"ContainerStarted","Data":"eff1f70fe22c7586f1400c2be0ab3d7fbc4ad06460446410190d3faad92db904"} Apr 22 21:44:26.858004 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:26.857819 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx" Apr 22 21:44:26.873524 ip-10-0-134-137 
kubenswrapper[2566]: I0422 21:44:26.873472 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx" podStartSLOduration=0.873458318 podStartE2EDuration="873.458318ms" podCreationTimestamp="2026-04-22 21:44:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:44:26.871509414 +0000 UTC m=+2124.228324167" watchObservedRunningTime="2026-04-22 21:44:26.873458318 +0000 UTC m=+2124.230273071" Apr 22 21:44:27.931550 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:27.931521 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lszg5_1161b36d-e8f6-417a-975a-80f7d1eba5e1/dns/0.log" Apr 22 21:44:27.948712 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:27.948683 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-lszg5_1161b36d-e8f6-417a-975a-80f7d1eba5e1/kube-rbac-proxy/0.log" Apr 22 21:44:28.045897 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:28.045863 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-t4hx2_83539234-7602-4c9d-a9c8-05dca158b65b/dns-node-resolver/0.log" Apr 22 21:44:28.512781 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:28.512748 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-jq6nq_fbd1aca7-dece-43f0-b914-b6993f56f39a/node-ca/0.log" Apr 22 21:44:29.427216 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:29.427183 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-rldff_16586da0-c3ba-4397-8b50-5817ee975d70/discovery/0.log" Apr 22 21:44:29.462750 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:29.462720 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_kube-auth-proxy-7b8c5f7f67-hvd97_d3c57abc-8177-4236-b215-e1fcf1fa366c/kube-auth-proxy/0.log"
Apr 22 21:44:30.071004 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:30.070972 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-wzzzt_18f0e0a7-a91f-432d-bcb0-5e33fb885077/serve-healthcheck-canary/0.log"
Apr 22 21:44:30.655957 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:30.655919 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ldx4t_6301c666-2041-4fac-a690-1918445b7057/kube-rbac-proxy/0.log"
Apr 22 21:44:30.675497 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:30.675467 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ldx4t_6301c666-2041-4fac-a690-1918445b7057/exporter/0.log"
Apr 22 21:44:30.693335 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:30.693307 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-ldx4t_6301c666-2041-4fac-a690-1918445b7057/extractor/0.log"
Apr 22 21:44:32.551131 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:32.551088 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-stmtk_591364e1-7e0f-4f15-8ee7-af576f419ab0/manager/1.log"
Apr 22 21:44:32.557930 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:32.557897 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-stmtk_591364e1-7e0f-4f15-8ee7-af576f419ab0/manager/0.log"
Apr 22 21:44:32.661083 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:32.661050 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-9dd94f74c-nx6bw_6d698428-b4e4-41dd-985f-57494492421e/manager/0.log"
Apr 22 21:44:32.700094 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:32.700057 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-rbgsq_75d67504-5aa5-459b-8ec5-a68199bd0be6/manager/0.log"
Apr 22 21:44:32.717612 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:32.717583 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-65d8664856-2kn6j_1d3599da-edfb-4118-bf7e-9534da06b88b/manager/0.log"
Apr 22 21:44:32.800062 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:32.800027 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-vrg5c_ee48ec0c-cc94-4b7f-9f3a-f4f896fe6025/postgres/0.log"
Apr 22 21:44:32.871990 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:32.871963 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-ddjjb/perf-node-gather-daemonset-d6rsx"
Apr 22 21:44:39.773653 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:39.773617 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w54p8_d86eb1d3-4e06-4952-905d-3bc13ae4849b/kube-multus-additional-cni-plugins/0.log"
Apr 22 21:44:39.791384 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:39.791344 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w54p8_d86eb1d3-4e06-4952-905d-3bc13ae4849b/egress-router-binary-copy/0.log"
Apr 22 21:44:39.806478 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:39.806445 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w54p8_d86eb1d3-4e06-4952-905d-3bc13ae4849b/cni-plugins/0.log"
Apr 22 21:44:39.822130 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:39.822105 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w54p8_d86eb1d3-4e06-4952-905d-3bc13ae4849b/bond-cni-plugin/0.log"
Apr 22 21:44:39.837696 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:39.837650 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w54p8_d86eb1d3-4e06-4952-905d-3bc13ae4849b/routeoverride-cni/0.log"
Apr 22 21:44:39.852959 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:39.852931 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w54p8_d86eb1d3-4e06-4952-905d-3bc13ae4849b/whereabouts-cni-bincopy/0.log"
Apr 22 21:44:39.868250 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:39.868220 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-w54p8_d86eb1d3-4e06-4952-905d-3bc13ae4849b/whereabouts-cni/0.log"
Apr 22 21:44:39.923323 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:39.923285 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f9bbq_ba523d03-d12e-4172-89eb-9885c3215d06/kube-multus/0.log"
Apr 22 21:44:40.036535 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:40.036432 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qvqz5_01c1be80-c7fc-433f-bf11-a97af5540866/network-metrics-daemon/0.log"
Apr 22 21:44:40.049536 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:40.049508 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qvqz5_01c1be80-c7fc-433f-bf11-a97af5540866/kube-rbac-proxy/0.log"
Apr 22 21:44:41.327145 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:41.327114 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovn-controller/0.log"
Apr 22 21:44:41.339803 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:41.339774 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovn-acl-logging/0.log"
Apr 22 21:44:41.358931 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:41.358898 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovn-acl-logging/1.log"
Apr 22 21:44:41.377630 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:41.377599 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/kube-rbac-proxy-node/0.log"
Apr 22 21:44:41.396074 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:41.396019 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 21:44:41.408559 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:41.408529 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/northd/0.log"
Apr 22 21:44:41.423120 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:41.423098 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/nbdb/0.log"
Apr 22 21:44:41.437844 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:41.437818 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/sbdb/0.log"
Apr 22 21:44:41.598498 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:41.598399 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wqvs8_aee77461-dec9-4db2-99dc-b345c2c300bb/ovnkube-controller/0.log"
Apr 22 21:44:42.649119 ip-10-0-134-137 kubenswrapper[2566]: I0422 21:44:42.649082 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-b4vrr_f1c2f55b-0450-43f7-b50e-57b4c3d15108/network-check-target-container/0.log"