Apr 21 15:59:09.607481 ip-10-0-129-96 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 15:59:09.607493 ip-10-0-129-96 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 15:59:09.607500 ip-10-0-129-96 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 15:59:09.607737 ip-10-0-129-96 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 15:59:19.694297 ip-10-0-129-96 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 15:59:19.694315 ip-10-0-129-96 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 15b4e29eda0a4de2b83dd3aa4b6bf432 --
Apr 21 16:01:43.245050 ip-10-0-129-96 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 16:01:43.812277 ip-10-0-129-96 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 16:01:43.812277 ip-10-0-129-96 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 16:01:43.812277 ip-10-0-129-96 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 16:01:43.812277 ip-10-0-129-96 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 16:01:43.812277 ip-10-0-129-96 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 16:01:43.815058 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.814970 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 16:01:43.818006 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.817991 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 16:01:43.818006 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818006 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 16:01:43.818073 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818011 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 16:01:43.818073 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818015 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 16:01:43.818073 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818018 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 16:01:43.818073 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818021 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 16:01:43.818073 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818025 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 16:01:43.818073 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818027 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 16:01:43.818073 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818030 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 16:01:43.818073 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818033 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 16:01:43.818073 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818036 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 16:01:43.818073 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818039 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 16:01:43.818073 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818042 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 16:01:43.818073 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818044 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 16:01:43.818073 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818047 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 16:01:43.818073 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818050 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 16:01:43.818073 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818053 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 16:01:43.818073 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818056 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 16:01:43.818073 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818058 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 16:01:43.818073 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818062 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 16:01:43.818073 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818065 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 16:01:43.818523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818067 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 16:01:43.818523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818070 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 16:01:43.818523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818073 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 16:01:43.818523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818075 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 16:01:43.818523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818078 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 16:01:43.818523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818082 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 16:01:43.818523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818085 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 16:01:43.818523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818088 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 16:01:43.818523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818091 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 16:01:43.818523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818094 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 16:01:43.818523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818097 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 16:01:43.818523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818099 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 16:01:43.818523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818102 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 16:01:43.818523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818105 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 16:01:43.818523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818107 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 16:01:43.818523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818110 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 16:01:43.818523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818112 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 16:01:43.818523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818115 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 16:01:43.818523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818117 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 16:01:43.818523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818120 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 16:01:43.819019 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818123 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 16:01:43.819019 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818127 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 16:01:43.819019 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818130 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 16:01:43.819019 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818133 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 16:01:43.819019 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818136 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 16:01:43.819019 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818138 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 16:01:43.819019 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818141 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 16:01:43.819019 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818143 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 16:01:43.819019 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818148 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 16:01:43.819019 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818151 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 16:01:43.819019 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818155 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 16:01:43.819019 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818158 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 16:01:43.819019 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818161 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 16:01:43.819019 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818164 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 16:01:43.819019 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818167 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 16:01:43.819019 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818171 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 16:01:43.819019 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818174 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 16:01:43.819019 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818177 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 16:01:43.819019 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818179 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 16:01:43.819019 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818182 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 16:01:43.819510 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818185 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 16:01:43.819510 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818188 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 16:01:43.819510 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818190 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 16:01:43.819510 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818193 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 16:01:43.819510 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818196 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 16:01:43.819510 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818199 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 16:01:43.819510 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818202 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 16:01:43.819510 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818205 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 16:01:43.819510 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818207 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 16:01:43.819510 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818210 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 16:01:43.819510 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818212 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 16:01:43.819510 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818215 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 16:01:43.819510 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818218 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 16:01:43.819510 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818220 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 16:01:43.819510 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818223 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 16:01:43.819510 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818226 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 16:01:43.819510 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818228 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 16:01:43.819510 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818231 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 16:01:43.819510 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818234 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 16:01:43.820036 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818237 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 16:01:43.820036 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818240 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 16:01:43.820036 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818242 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 16:01:43.820036 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818245 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 16:01:43.820036 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818248 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 16:01:43.820036 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818250 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 16:01:43.820036 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818673 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 16:01:43.820036 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818679 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 16:01:43.820036 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818682 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 16:01:43.820036 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818685 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 16:01:43.820036 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818689 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 16:01:43.820036 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818692 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 16:01:43.820036 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818694 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 16:01:43.820036 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818697 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 16:01:43.820036 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818700 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 16:01:43.820036 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818702 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 16:01:43.820036 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818705 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 16:01:43.820036 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818708 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 16:01:43.820036 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818712 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 16:01:43.820036 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818715 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 16:01:43.820523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818717 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 16:01:43.820523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818720 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 16:01:43.820523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818722 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 16:01:43.820523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818725 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 16:01:43.820523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818728 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 16:01:43.820523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818730 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 16:01:43.820523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818733 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 16:01:43.820523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818736 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 16:01:43.820523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818739 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 16:01:43.820523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818741 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 16:01:43.820523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818743 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 16:01:43.820523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818746 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 16:01:43.820523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818748 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 16:01:43.820523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818751 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 16:01:43.820523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818754 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 16:01:43.820523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818756 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 16:01:43.820523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818759 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 16:01:43.820523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818761 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 16:01:43.820523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818765 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 16:01:43.820523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818768 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 16:01:43.821042 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818784 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 16:01:43.821042 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818787 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 16:01:43.821042 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818790 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 16:01:43.821042 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818793 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 16:01:43.821042 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818796 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 16:01:43.821042 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818798 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 16:01:43.821042 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818801 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 16:01:43.821042 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818803 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 16:01:43.821042 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818806 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 16:01:43.821042 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818809 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 16:01:43.821042 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818812 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 16:01:43.821042 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818815 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 16:01:43.821042 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818817 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 16:01:43.821042 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818820 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 16:01:43.821042 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818823 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 16:01:43.821042 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818827 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 16:01:43.821042 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818831 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 16:01:43.821042 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818834 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 16:01:43.821042 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818836 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 16:01:43.821042 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818839 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 16:01:43.821528 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818841 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 16:01:43.821528 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818844 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 16:01:43.821528 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818847 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 16:01:43.821528 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818850 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 16:01:43.821528 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818853 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 16:01:43.821528 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818856 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 16:01:43.821528 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818858 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 16:01:43.821528 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818862 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 16:01:43.821528 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818866 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 16:01:43.821528 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818870 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 16:01:43.821528 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818874 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 16:01:43.821528 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818877 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 16:01:43.821528 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818880 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 16:01:43.821528 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818882 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 16:01:43.821528 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818885 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 16:01:43.821528 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818888 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 16:01:43.821528 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818891 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 16:01:43.821528 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818893 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 16:01:43.821528 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818896 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 16:01:43.822014 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818898 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 16:01:43.822014 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818901 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 16:01:43.822014 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818904 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 16:01:43.822014 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818906 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 16:01:43.822014 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818909 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 16:01:43.822014 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818912 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 16:01:43.822014 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818915 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 16:01:43.822014 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818917 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 16:01:43.822014 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818920 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 16:01:43.822014 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818922 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 16:01:43.822014 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818925 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 16:01:43.822014 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818927 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 16:01:43.822014 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.818930 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 16:01:43.822014 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819641 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 16:01:43.822014 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819650 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 16:01:43.822014 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819656 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 16:01:43.822014 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819661 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 16:01:43.822014 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819666 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 16:01:43.822014 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819670 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 16:01:43.822014 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819674 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 16:01:43.822014 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819679 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819682 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819685 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819690 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819693 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819697 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819700 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819703 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819706 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819709 2577 flags.go:64] FLAG: --cloud-config=""
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819711 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819714 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819719 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819722 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819725 2577 flags.go:64] FLAG: --config-dir=""
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819728 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819731 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819735 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819739 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819742 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819745 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819756 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819760 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819763 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819766 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 16:01:43.822543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819770 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819785 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819789 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819792 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819795 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819798 2577 flags.go:64] FLAG: --enable-server="true"
Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819801 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819805 2577 flags.go:64] FLAG: --event-burst="100"
Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819809 2577 flags.go:64] FLAG: --event-qps="50"
Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819812 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819815 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819818 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819822 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819825 2577 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819828 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819831 2577 flags.go:64] FLAG: --eviction-soft="" Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819834 2577 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819837 2577 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819840 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819843 2577 flags.go:64] FLAG: --experimental-mounter-path="" Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819846 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819849 2577 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819852 2577 flags.go:64] FLAG: --feature-gates="" Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819856 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819859 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 16:01:43.823174 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819862 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819866 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819869 
2577 flags.go:64] FLAG: --healthz-port="10248" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819872 2577 flags.go:64] FLAG: --help="false" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819875 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-129-96.ec2.internal" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819878 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819881 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819884 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819887 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819891 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819894 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819897 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819900 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819903 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819905 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819909 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 16:01:43.823894 ip-10-0-129-96 
kubenswrapper[2577]: I0421 16:01:43.819912 2577 flags.go:64] FLAG: --kube-reserved="" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819915 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819917 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819921 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819923 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819926 2577 flags.go:64] FLAG: --lock-file="" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819929 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819932 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 16:01:43.823894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819935 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819940 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819943 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819946 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819949 2577 flags.go:64] FLAG: --logging-format="text" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819957 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819960 2577 flags.go:64] FLAG: 
--make-iptables-util-chains="true" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819964 2577 flags.go:64] FLAG: --manifest-url="" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819966 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819971 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819981 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819986 2577 flags.go:64] FLAG: --max-pods="110" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819989 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819992 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819995 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.819998 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820001 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820004 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820008 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820016 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820019 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 16:01:43.824473 
ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820022 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820025 2577 flags.go:64] FLAG: --pod-cidr="" Apr 21 16:01:43.824473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820028 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820034 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820037 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820040 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820043 2577 flags.go:64] FLAG: --port="10250" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820047 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820050 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-069a12c574818a428" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820054 2577 flags.go:64] FLAG: --qos-reserved="" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820057 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820060 2577 flags.go:64] FLAG: --register-node="true" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820063 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820066 2577 flags.go:64] FLAG: --register-with-taints="" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 
16:01:43.820069 2577 flags.go:64] FLAG: --registry-burst="10" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820072 2577 flags.go:64] FLAG: --registry-qps="5" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820075 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820079 2577 flags.go:64] FLAG: --reserved-memory="" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820083 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820086 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820089 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820092 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820095 2577 flags.go:64] FLAG: --runonce="false" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820098 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820101 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820104 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820109 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820112 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 16:01:43.825075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820115 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 16:01:43.825698 ip-10-0-129-96 
kubenswrapper[2577]: I0421 16:01:43.820118 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820122 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820124 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820128 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820130 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820133 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820136 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820139 2577 flags.go:64] FLAG: --system-cgroups="" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820143 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820149 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820151 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820154 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820159 2577 flags.go:64] FLAG: --tls-min-version="" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820162 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820165 2577 flags.go:64] FLAG: 
--topology-manager-policy="none" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820167 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820170 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820174 2577 flags.go:64] FLAG: --v="2" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820178 2577 flags.go:64] FLAG: --version="false" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820182 2577 flags.go:64] FLAG: --vmodule="" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820188 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.820192 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820289 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820293 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 16:01:43.825698 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820296 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 16:01:43.826299 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820299 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 16:01:43.826299 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820302 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 16:01:43.826299 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820305 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 16:01:43.826299 ip-10-0-129-96 kubenswrapper[2577]: W0421 
16:01:43.820308 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 16:01:43.826299 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820312 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 16:01:43.826299 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820315 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 16:01:43.826299 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820317 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 16:01:43.826299 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820320 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 16:01:43.826299 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820322 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 16:01:43.826299 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820325 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 16:01:43.826299 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820328 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 16:01:43.826299 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820330 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 16:01:43.826299 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820334 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 16:01:43.826299 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820339 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 16:01:43.826299 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820342 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 16:01:43.826299 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820345 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 16:01:43.826299 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820347 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 16:01:43.826299 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820350 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 16:01:43.826299 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820353 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 16:01:43.826856 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820355 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 16:01:43.826856 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820358 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 16:01:43.826856 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820361 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 16:01:43.826856 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820363 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 16:01:43.826856 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820366 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 16:01:43.826856 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820368 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 16:01:43.826856 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820371 2577 feature_gate.go:328] unrecognized 
feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 16:01:43.826856 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820374 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 16:01:43.826856 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820378 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 16:01:43.826856 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820380 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 16:01:43.826856 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820383 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 16:01:43.826856 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820386 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 16:01:43.826856 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820388 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 16:01:43.826856 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820391 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 16:01:43.826856 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820393 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 16:01:43.826856 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820396 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 16:01:43.826856 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820399 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 16:01:43.826856 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820402 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 16:01:43.826856 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820405 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 16:01:43.826856 
ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820408 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 16:01:43.827607 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820410 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 16:01:43.827607 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820413 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 16:01:43.827607 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820415 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 16:01:43.827607 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820418 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 16:01:43.827607 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820420 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 16:01:43.827607 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820423 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 16:01:43.827607 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820425 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 16:01:43.827607 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820428 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 16:01:43.827607 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820430 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 16:01:43.827607 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820433 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 16:01:43.827607 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820436 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 16:01:43.827607 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820438 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages 
Apr 21 16:01:43.827607 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820441 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 16:01:43.827607 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820443 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 16:01:43.827607 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820446 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 16:01:43.827607 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820448 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 16:01:43.827607 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820451 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 16:01:43.827607 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820453 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 16:01:43.827607 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820456 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 16:01:43.827607 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820459 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 16:01:43.828490 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820463 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 16:01:43.828490 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820465 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 16:01:43.828490 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820468 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 21 16:01:43.828490 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820470 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 16:01:43.828490 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820473 2577 feature_gate.go:328] unrecognized feature gate: 
MetricsCollectionProfiles Apr 21 16:01:43.828490 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820475 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 16:01:43.828490 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820478 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 16:01:43.828490 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820480 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 16:01:43.828490 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820484 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 16:01:43.828490 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820491 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 16:01:43.828490 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820493 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 16:01:43.828490 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820496 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 16:01:43.828490 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820498 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 16:01:43.828490 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820501 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 16:01:43.828490 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820504 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 16:01:43.828490 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820506 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 16:01:43.828490 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820509 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 16:01:43.828490 ip-10-0-129-96 kubenswrapper[2577]: W0421 
16:01:43.820512 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 16:01:43.828490 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820515 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 16:01:43.828490 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820517 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 16:01:43.829418 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820520 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 16:01:43.829418 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820523 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 16:01:43.829418 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820528 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 16:01:43.829418 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.820531 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 16:01:43.829418 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.821580 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 16:01:43.829418 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.829177 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 16:01:43.829418 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.829198 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 16:01:43.829418 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829302 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 16:01:43.829418 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829310 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 16:01:43.829418 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829316 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 16:01:43.829418 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829320 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 16:01:43.829418 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829325 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 16:01:43.829418 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829329 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 16:01:43.829418 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829334 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 16:01:43.829418 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829338 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 16:01:43.830103 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829342 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 16:01:43.830103 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829347 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 16:01:43.830103 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829352 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 16:01:43.830103 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829357 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 16:01:43.830103 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829362 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 16:01:43.830103 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829366 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 16:01:43.830103 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829371 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 16:01:43.830103 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829375 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 16:01:43.830103 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829379 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 16:01:43.830103 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829384 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 16:01:43.830103 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829388 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 16:01:43.830103 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829392 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 16:01:43.830103 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829397 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 16:01:43.830103 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829401 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 16:01:43.830103 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829405 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 16:01:43.830103 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829410 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 16:01:43.830103 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829414 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 16:01:43.830103 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829417 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 16:01:43.830103 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829421 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 16:01:43.830103 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829426 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 16:01:43.830752 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829430 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 16:01:43.830752 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829434 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 16:01:43.830752 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829438 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 16:01:43.830752 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829443 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 16:01:43.830752 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829448 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 16:01:43.830752 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829452 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 16:01:43.830752 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829456 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 16:01:43.830752 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829460 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 16:01:43.830752 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829464 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 16:01:43.830752 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829468 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 16:01:43.830752 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829473 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 16:01:43.830752 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829477 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 16:01:43.830752 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829482 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 16:01:43.830752 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829486 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 16:01:43.830752 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829491 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 16:01:43.830752 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829496 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 16:01:43.830752 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829500 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 16:01:43.830752 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829504 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 16:01:43.830752 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829509 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 16:01:43.830752 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829513 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 16:01:43.831333 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829517 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 16:01:43.831333 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829521 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 16:01:43.831333 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829526 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 16:01:43.831333 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829529 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 16:01:43.831333 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829534 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 16:01:43.831333 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829538 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 16:01:43.831333 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829542 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 16:01:43.831333 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829546 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 16:01:43.831333 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829550 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 16:01:43.831333 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829554 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 16:01:43.831333 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829559 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 16:01:43.831333 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829563 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 16:01:43.831333 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829567 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 16:01:43.831333 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829571 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 16:01:43.831333 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829575 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 16:01:43.831333 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829579 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 16:01:43.831333 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829586 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 16:01:43.831333 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829590 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 16:01:43.831333 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829595 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 16:01:43.831942 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829602 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 16:01:43.831942 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829609 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 16:01:43.831942 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829613 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 16:01:43.831942 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829618 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 16:01:43.831942 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829622 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 16:01:43.831942 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829628 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 16:01:43.831942 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829634 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 16:01:43.831942 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829638 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 16:01:43.831942 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829643 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 16:01:43.831942 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829648 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 16:01:43.831942 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829652 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 16:01:43.831942 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829656 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 16:01:43.831942 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829660 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 16:01:43.831942 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829664 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 16:01:43.831942 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829668 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 16:01:43.831942 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829672 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 16:01:43.831942 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829677 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 16:01:43.831942 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829683 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 16:01:43.831942 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829689 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 16:01:43.832708 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.829697 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 16:01:43.832708 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829877 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 16:01:43.832708 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829887 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 16:01:43.832708 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829893 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 16:01:43.832708 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829897 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 16:01:43.832708 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829902 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 16:01:43.832708 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829906 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 16:01:43.832708 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829912 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 16:01:43.832708 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829916 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 16:01:43.832708 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829921 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 16:01:43.832708 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829925 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 16:01:43.832708 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829931 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 16:01:43.832708 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829936 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 16:01:43.832708 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829941 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 16:01:43.832708 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829945 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 16:01:43.833306 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829949 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 16:01:43.833306 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829953 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 16:01:43.833306 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829958 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 16:01:43.833306 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829962 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 16:01:43.833306 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829967 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 16:01:43.833306 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829971 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 16:01:43.833306 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829975 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 16:01:43.833306 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829979 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 16:01:43.833306 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829984 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 16:01:43.833306 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829988 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 16:01:43.833306 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829992 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 16:01:43.833306 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.829996 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 16:01:43.833306 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830000 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 16:01:43.833306 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830004 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 16:01:43.833306 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830009 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 16:01:43.833306 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830013 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 16:01:43.833306 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830017 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 16:01:43.833306 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830021 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 16:01:43.833306 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830025 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 16:01:43.833306 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830029 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 16:01:43.833850 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830033 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 16:01:43.833850 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830037 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 16:01:43.833850 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830042 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 16:01:43.833850 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830046 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 16:01:43.833850 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830050 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 16:01:43.833850 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830054 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 16:01:43.833850 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830058 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 16:01:43.833850 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830062 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 16:01:43.833850 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830067 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 16:01:43.833850 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830072 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 16:01:43.833850 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830077 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 16:01:43.833850 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830081 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 16:01:43.833850 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830085 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 16:01:43.833850 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830089 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 16:01:43.833850 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830094 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 16:01:43.833850 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830098 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 16:01:43.833850 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830102 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 16:01:43.833850 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830106 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 16:01:43.833850 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830110 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 16:01:43.833850 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830114 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 16:01:43.834411 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830118 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 16:01:43.834411 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830122 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 16:01:43.834411 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830126 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 16:01:43.834411 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830131 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 16:01:43.834411 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830135 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 16:01:43.834411 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830139 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 16:01:43.834411 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830144 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 16:01:43.834411 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830148 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 16:01:43.834411 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830152 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 16:01:43.834411 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830156 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 16:01:43.834411 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830160 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 16:01:43.834411 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830164 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 16:01:43.834411 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830169 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 16:01:43.834411 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830173 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 16:01:43.834411 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830177 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 16:01:43.834411 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830181 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 16:01:43.834411 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830185 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 16:01:43.834411 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830189 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 16:01:43.834411 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830194 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 16:01:43.834411 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830198 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 16:01:43.834957 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830203 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 16:01:43.834957 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830207 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 16:01:43.834957 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830212 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 16:01:43.834957 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830216 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 16:01:43.834957 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830220 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 16:01:43.834957 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830224 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 16:01:43.834957 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830228 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 16:01:43.834957 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830233 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 16:01:43.834957 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830237 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 16:01:43.834957 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830242 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 16:01:43.834957 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830248 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 16:01:43.834957 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:43.830254 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 16:01:43.834957 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.830262 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 16:01:43.834957 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.831081 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 16:01:43.834957 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.833906 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 16:01:43.836195 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.836181 2577 server.go:1019] "Starting client certificate rotation"
Apr 21 16:01:43.836304 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.836285 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 16:01:43.837662 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.837649 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 16:01:43.871940 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.871907 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 16:01:43.878439 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.878408 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 16:01:43.898054 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.898026 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 21 16:01:43.905016 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.904993 2577 log.go:25] "Validated CRI v1 image API"
Apr 21 16:01:43.906294 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.906269 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 16:01:43.911324 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.911302 2577 fs.go:135] Filesystem UUIDs: map[1927e3a5-bb85-4bac-bb64-d2447f657c5d:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 b87650ac-c0ef-4d5a-9622-17c94657c7bc:/dev/nvme0n1p4]
Apr 21 16:01:43.911398 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.911323 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 16:01:43.914437 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.914417 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 16:01:43.917265 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.917143 2577 manager.go:217] Machine: {Timestamp:2026-04-21 16:01:43.915022001 +0000 UTC m=+0.521924098 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3104140 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2e206ff3c52ff1ab3f1ff4aebdeffa SystemUUID:ec2e206f-f3c5-2ff1-ab3f-1ff4aebdeffa BootID:15b4e29e-da0a-4de2-b83d-d3aa4b6bf432 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:69:8a:ec:b3:17 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:69:8a:ec:b3:17 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:b6:84:d0:7b:4c:4f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 16:01:43.917265 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.917260 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 16:01:43.917372 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.917345 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 16:01:43.918600 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.918579 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 16:01:43.918746 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.918601 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-96.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 16:01:43.918808 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.918751 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 16:01:43.918808 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.918761 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 16:01:43.918808 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.918789 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 16:01:43.919917 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.919906 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 16:01:43.921586 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.921576 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 16:01:43.921694 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.921685 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 16:01:43.925233 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.925221 2577 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 16:01:43.925277 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.925241 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 16:01:43.925277 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.925252 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 16:01:43.925277 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.925263 2577 kubelet.go:397] "Adding apiserver pod source"
Apr 21 16:01:43.925277 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.925275 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 16:01:43.926511 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.926496 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 16:01:43.926511 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.926514 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 16:01:43.930575 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.930556 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 16:01:43.932109 ip-10-0-129-96
kubenswrapper[2577]: I0421 16:01:43.932094 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 16:01:43.934818 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.934794 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 16:01:43.934894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.934826 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 16:01:43.934894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.934839 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 16:01:43.934894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.934851 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 16:01:43.934894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.934862 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 16:01:43.934894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.934874 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 16:01:43.934894 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.934886 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 16:01:43.935081 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.934898 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 16:01:43.935081 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.934915 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 16:01:43.935081 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.934928 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 16:01:43.935081 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.934944 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 16:01:43.935081 
ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.934962 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 16:01:43.937056 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.937043 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 16:01:43.937056 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.937056 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 16:01:43.941601 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.941586 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 16:01:43.941676 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.941635 2577 server.go:1295] "Started kubelet" Apr 21 16:01:43.941758 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.941728 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 16:01:43.941831 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.941743 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 16:01:43.941831 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.941813 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 16:01:43.942526 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.942507 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-96.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 16:01:43.942615 ip-10-0-129-96 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 16:01:43.942713 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:43.942666 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-96.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 21 16:01:43.942713 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:43.942696 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 21 16:01:43.943017 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.943005 2577 server.go:317] "Adding debug handlers to kubelet server"
Apr 21 16:01:43.943229 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.943212 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 21 16:01:43.949992 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.949973 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 21 16:01:43.950434 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:43.948990 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-96.ec2.internal.18a86aa39cdcf65f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-96.ec2.internal,UID:ip-10-0-129-96.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-96.ec2.internal,},FirstTimestamp:2026-04-21 16:01:43.941600863 +0000 UTC m=+0.548502964,LastTimestamp:2026-04-21 16:01:43.941600863 +0000 UTC m=+0.548502964,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-96.ec2.internal,}"
Apr 21 16:01:43.950652 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.950631 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 21 16:01:43.951435 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.951405 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 21 16:01:43.951551 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.951446 2577 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 21 16:01:43.951551 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.951552 2577 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 21 16:01:43.951721 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.951632 2577 reconstruct.go:97] "Volume reconstruction finished"
Apr 21 16:01:43.951721 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.951643 2577 reconciler.go:26] "Reconciler: start to sync state"
Apr 21 16:01:43.951721 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:43.951690 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-96.ec2.internal\" not found"
Apr 21 16:01:43.952570 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.952547 2577 factory.go:153] Registering CRI-O factory
Apr 21 16:01:43.952649 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.952614 2577 factory.go:223] Registration of the crio container factory successfully
Apr 21 16:01:43.952710 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.952696 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 21 16:01:43.952749 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.952711 2577 factory.go:55] Registering systemd factory
Apr 21 16:01:43.952749 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.952720 2577 factory.go:223] Registration of the systemd container factory successfully
Apr 21 16:01:43.952749 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.952743 2577 factory.go:103] Registering Raw factory
Apr 21 16:01:43.952900 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.952755 2577 manager.go:1196] Started watching for new ooms in manager
Apr 21 16:01:43.953139 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:43.953115 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 21 16:01:43.953579 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.953459 2577 manager.go:319] Starting recovery of all containers
Apr 21 16:01:43.958944 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:43.958919 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 21 16:01:43.959396 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:43.959369 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-96.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 21 16:01:43.964498 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.964343 2577 manager.go:324] Recovery completed
Apr 21 16:01:43.970609 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.970593 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 16:01:43.973046 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.973029 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-96.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 16:01:43.973115 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.973061 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-96.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 16:01:43.973115 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.973071 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-96.ec2.internal" event="NodeHasSufficientPID"
Apr 21 16:01:43.973587 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.973566 2577 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 21 16:01:43.973587 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.973578 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 21 16:01:43.973660 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.973597 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 16:01:43.975543 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:43.975482 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-96.ec2.internal.18a86aa39ebccb3c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-96.ec2.internal,UID:ip-10-0-129-96.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-96.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-96.ec2.internal,},FirstTimestamp:2026-04-21 16:01:43.9730471 +0000 UTC m=+0.579949197,LastTimestamp:2026-04-21 16:01:43.9730471 +0000 UTC m=+0.579949197,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-96.ec2.internal,}"
Apr 21 16:01:43.976547 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.976535 2577 policy_none.go:49] "None policy: Start"
Apr 21 16:01:43.976587 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.976552 2577 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 21 16:01:43.976587 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:43.976563 2577 state_mem.go:35] "Initializing new in-memory state store"
Apr 21 16:01:43.985186 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:43.985091 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-96.ec2.internal.18a86aa39ebd14b8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-96.ec2.internal,UID:ip-10-0-129-96.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-129-96.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-129-96.ec2.internal,},FirstTimestamp:2026-04-21 16:01:43.973065912 +0000 UTC m=+0.579968010,LastTimestamp:2026-04-21 16:01:43.973065912 +0000 UTC m=+0.579968010,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-96.ec2.internal,}"
Apr 21 16:01:43.994548 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:43.994478 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-96.ec2.internal.18a86aa39ebd392d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-96.ec2.internal,UID:ip-10-0-129-96.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-129-96.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-129-96.ec2.internal,},FirstTimestamp:2026-04-21 16:01:43.973075245 +0000 UTC m=+0.579977343,LastTimestamp:2026-04-21 16:01:43.973075245 +0000 UTC m=+0.579977343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-96.ec2.internal,}"
Apr 21 16:01:44.017470 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.017447 2577 manager.go:341] "Starting Device Plugin manager"
Apr 21 16:01:44.019944 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:44.017500 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 21 16:01:44.019944 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.017515 2577 server.go:85] "Starting device plugin registration server"
Apr 21 16:01:44.019944 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.017807 2577 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 21 16:01:44.019944 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.017820 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 21 16:01:44.019944 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.017913 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 21 16:01:44.019944 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.017988 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 21 16:01:44.019944 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.017995 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 21 16:01:44.019944 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:44.018580 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 21 16:01:44.019944 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:44.018623 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-96.ec2.internal\" not found"
Apr 21 16:01:44.020283 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.020192 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-t7nc8"
Apr 21 16:01:44.032459 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.032437 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-t7nc8"
Apr 21 16:01:44.033695 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:44.033622 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-96.ec2.internal.18a86aa3a186297a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-96.ec2.internal,UID:ip-10-0-129-96.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-129-96.ec2.internal,},FirstTimestamp:2026-04-21 16:01:44.019798394 +0000 UTC m=+0.626700488,LastTimestamp:2026-04-21 16:01:44.019798394 +0000 UTC m=+0.626700488,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-96.ec2.internal,}"
Apr 21 16:01:44.048006 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.047975 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 21 16:01:44.049286 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.049267 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 21 16:01:44.049366 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.049297 2577 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 21 16:01:44.049366 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.049322 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 21 16:01:44.049366 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.049331 2577 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 21 16:01:44.049366 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:44.049363 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 21 16:01:44.074761 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.074684 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 16:01:44.117966 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.117939 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 16:01:44.118924 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.118906 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-96.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 16:01:44.119014 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.118936 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-96.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 16:01:44.119014 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.118946 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-96.ec2.internal" event="NodeHasSufficientPID"
Apr 21 16:01:44.119014 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.118970 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-96.ec2.internal"
Apr 21 16:01:44.130912 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.130895 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-96.ec2.internal"
Apr 21 16:01:44.130958 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:44.130917 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-96.ec2.internal\": node \"ip-10-0-129-96.ec2.internal\" not found"
Apr 21 16:01:44.149895 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.149874 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-96.ec2.internal"]
Apr 21 16:01:44.149964 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.149934 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 16:01:44.151397 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.151381 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-96.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 16:01:44.151460 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.151404 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-96.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 16:01:44.151460 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.151414 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-96.ec2.internal" event="NodeHasSufficientPID"
Apr 21 16:01:44.152870 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.152859 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 16:01:44.153018 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.153004 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal"
Apr 21 16:01:44.153062 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.153031 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 16:01:44.153516 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.153500 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-96.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 16:01:44.153560 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.153530 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-96.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 16:01:44.153560 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.153544 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-96.ec2.internal" event="NodeHasSufficientPID"
Apr 21 16:01:44.153620 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.153503 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-96.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 16:01:44.153651 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.153620 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-96.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 16:01:44.153651 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.153634 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-96.ec2.internal" event="NodeHasSufficientPID"
Apr 21 16:01:44.154876 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.154862 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-96.ec2.internal"
Apr 21 16:01:44.154950 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.154888 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 16:01:44.155494 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.155479 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-96.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 16:01:44.155577 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.155508 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-96.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 16:01:44.155577 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.155522 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-96.ec2.internal" event="NodeHasSufficientPID"
Apr 21 16:01:44.159307 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:44.159294 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-96.ec2.internal\" not found"
Apr 21 16:01:44.174327 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:44.174305 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-96.ec2.internal\" not found" node="ip-10-0-129-96.ec2.internal"
Apr 21 16:01:44.178252 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:44.178235 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-96.ec2.internal\" not found" node="ip-10-0-129-96.ec2.internal"
Apr 21 16:01:44.253351 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.253317 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0544071a2bc7a957493cde1dbb18d0ee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal\" (UID: \"0544071a2bc7a957493cde1dbb18d0ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal"
Apr 21 16:01:44.253351 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.253349 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0544071a2bc7a957493cde1dbb18d0ee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal\" (UID: \"0544071a2bc7a957493cde1dbb18d0ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal"
Apr 21 16:01:44.253542 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.253368 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ef7a67774dc3cec2ee7c5d8876aafbd3-config\") pod \"kube-apiserver-proxy-ip-10-0-129-96.ec2.internal\" (UID: \"ef7a67774dc3cec2ee7c5d8876aafbd3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-96.ec2.internal"
Apr 21 16:01:44.259363 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:44.259339 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-96.ec2.internal\" not found"
Apr 21 16:01:44.354041 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.353973 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0544071a2bc7a957493cde1dbb18d0ee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal\" (UID: \"0544071a2bc7a957493cde1dbb18d0ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal"
Apr 21 16:01:44.354041 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.354004 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0544071a2bc7a957493cde1dbb18d0ee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal\" (UID: \"0544071a2bc7a957493cde1dbb18d0ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal"
Apr 21 16:01:44.354041 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.354023 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ef7a67774dc3cec2ee7c5d8876aafbd3-config\") pod \"kube-apiserver-proxy-ip-10-0-129-96.ec2.internal\" (UID: \"ef7a67774dc3cec2ee7c5d8876aafbd3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-96.ec2.internal"
Apr 21 16:01:44.354216 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.354092 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0544071a2bc7a957493cde1dbb18d0ee-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal\" (UID: \"0544071a2bc7a957493cde1dbb18d0ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal"
Apr 21 16:01:44.354216 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.354167 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ef7a67774dc3cec2ee7c5d8876aafbd3-config\") pod \"kube-apiserver-proxy-ip-10-0-129-96.ec2.internal\" (UID: \"ef7a67774dc3cec2ee7c5d8876aafbd3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-96.ec2.internal"
Apr 21 16:01:44.354216 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.354207 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0544071a2bc7a957493cde1dbb18d0ee-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal\" (UID: \"0544071a2bc7a957493cde1dbb18d0ee\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal"
Apr 21 16:01:44.360081 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:44.360061 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-96.ec2.internal\" not found"
Apr 21 16:01:44.460881 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:44.460834 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-96.ec2.internal\" not found"
Apr 21 16:01:44.477017 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.476992 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal"
Apr 21 16:01:44.482027 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.482011 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-96.ec2.internal"
Apr 21 16:01:44.561607 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:44.561572 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-96.ec2.internal\" not found"
Apr 21 16:01:44.662131 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:44.662055 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-96.ec2.internal\" not found"
Apr 21 16:01:44.762707 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:44.762667 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-96.ec2.internal\" not found"
Apr 21 16:01:44.835859 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.835831 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 16:01:44.836358 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.835965 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 16:01:44.863322 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:44.863291 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-96.ec2.internal\" not found"
Apr 21 16:01:44.950944 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.950918 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 16:01:44.963430 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:44.963410 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-96.ec2.internal\" not found"
Apr 21 16:01:44.973573 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.973553 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 16:01:44.996470 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:44.996447 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-bngsk"
Apr 21 16:01:45.005933 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.005916 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-bngsk"
Apr 21 16:01:45.035213 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.035168 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 15:56:44 +0000 UTC" deadline="2028-02-05 09:43:10.242437398 +0000 UTC"
Apr 21 16:01:45.035314 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.035214 2577 certificate_manager.go:431] "Waiting for next certificate rotation"
logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15713h41m25.207227267s" Apr 21 16:01:45.063505 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:45.063478 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-96.ec2.internal\" not found" Apr 21 16:01:45.146288 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:45.146257 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0544071a2bc7a957493cde1dbb18d0ee.slice/crio-e3eb962cba3d916769be3e0cef1b7b51e29b3b65e213cc6649fa5f2a03fe9863 WatchSource:0}: Error finding container e3eb962cba3d916769be3e0cef1b7b51e29b3b65e213cc6649fa5f2a03fe9863: Status 404 returned error can't find the container with id e3eb962cba3d916769be3e0cef1b7b51e29b3b65e213cc6649fa5f2a03fe9863 Apr 21 16:01:45.146553 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:45.146533 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef7a67774dc3cec2ee7c5d8876aafbd3.slice/crio-a7c7e15643129061e1950a020d701794ff6fb3df61772a0d1b28069207fcb5a5 WatchSource:0}: Error finding container a7c7e15643129061e1950a020d701794ff6fb3df61772a0d1b28069207fcb5a5: Status 404 returned error can't find the container with id a7c7e15643129061e1950a020d701794ff6fb3df61772a0d1b28069207fcb5a5 Apr 21 16:01:45.151295 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.151278 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 16:01:45.164153 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:45.164132 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-96.ec2.internal\" not found" Apr 21 16:01:45.220939 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.220881 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 
16:01:45.264395 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:45.264369 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-96.ec2.internal\" not found" Apr 21 16:01:45.364880 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:45.364842 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-96.ec2.internal\" not found" Apr 21 16:01:45.413093 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.413061 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 16:01:45.465855 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:45.465814 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-96.ec2.internal\" not found" Apr 21 16:01:45.475616 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.475557 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 16:01:45.551758 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.551708 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal" Apr 21 16:01:45.570191 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.570163 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 16:01:45.570328 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.570290 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-96.ec2.internal" Apr 21 16:01:45.582537 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.582519 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 16:01:45.925991 
ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.925908 2577 apiserver.go:52] "Watching apiserver" Apr 21 16:01:45.934033 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.933720 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 16:01:45.935341 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.935226 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal","openshift-multus/network-metrics-daemon-qhl4l","openshift-network-operator/iptables-alerter-cpgvc","openshift-ovn-kubernetes/ovnkube-node-6rkhl","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt","openshift-image-registry/node-ca-8flks","openshift-multus/multus-4r2h9","openshift-multus/multus-additional-cni-plugins-8jszj","openshift-network-diagnostics/network-check-target-8gq94","kube-system/konnectivity-agent-729fq","kube-system/kube-apiserver-proxy-ip-10-0-129-96.ec2.internal","openshift-cluster-node-tuning-operator/tuned-r7qx7","openshift-dns/node-resolver-86kzc"] Apr 21 16:01:45.938868 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.938817 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4r2h9" Apr 21 16:01:45.940189 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.940169 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhl4l" Apr 21 16:01:45.940286 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:45.940242 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qhl4l" podUID="4c538f88-ee19-454d-9d5b-09dd0a3ed71b" Apr 21 16:01:45.940286 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.940276 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-cpgvc" Apr 21 16:01:45.943464 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.943413 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-txncd\"" Apr 21 16:01:45.943575 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.943459 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 16:01:45.943575 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.943466 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 16:01:45.943683 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.943583 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 16:01:45.943683 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.943604 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 16:01:45.943802 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.943678 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 16:01:45.943802 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.943766 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 16:01:45.943948 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.943816 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 16:01:45.944181 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.944127 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:45.944272 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.944223 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lfvlh\"" Apr 21 16:01:45.944473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.944426 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt" Apr 21 16:01:45.946066 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.946043 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8flks" Apr 21 16:01:45.948081 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.947826 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8jszj" Apr 21 16:01:45.948857 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.948813 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 16:01:45.949066 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.949045 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 16:01:45.949210 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.949194 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 16:01:45.949817 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.949531 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 16:01:45.949817 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.949577 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gq94" Apr 21 16:01:45.949817 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:45.949660 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8gq94" podUID="92ed2261-b49e-45eb-9081-57d883cfcf5a" Apr 21 16:01:45.949817 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.949749 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 16:01:45.950042 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.949890 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 16:01:45.950042 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.949957 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 16:01:45.950042 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.949997 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 16:01:45.950172 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.950061 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 16:01:45.950211 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.950203 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 16:01:45.950298 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.950249 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-fn8md\"" Apr 21 16:01:45.950298 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.950276 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 16:01:45.950743 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.950539 2577 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-2zw7l\"" Apr 21 16:01:45.950743 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.950593 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-grrhp\"" Apr 21 16:01:45.950743 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.950683 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-sknb7\"" Apr 21 16:01:45.950743 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.950730 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 16:01:45.951061 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.951045 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 16:01:45.951120 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.951063 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 16:01:45.952606 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.952563 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-729fq" Apr 21 16:01:45.956687 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.955314 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8t4mh\"" Apr 21 16:01:45.956687 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.955611 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 16:01:45.956687 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.955907 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 16:01:45.957647 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.957628 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:45.959694 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.959365 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-86kzc" Apr 21 16:01:45.960681 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.960645 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 16:01:45.960681 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.960672 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-x5zds\"" Apr 21 16:01:45.960842 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.960755 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 16:01:45.961764 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.961745 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 16:01:45.961925 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.961904 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-zpz2n\"" Apr 21 16:01:45.962132 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.962114 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 16:01:45.964797 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.964570 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f334a381-fb5b-413d-b975-0e2511378c95-konnectivity-ca\") pod \"konnectivity-agent-729fq\" (UID: \"f334a381-fb5b-413d-b975-0e2511378c95\") " pod="kube-system/konnectivity-agent-729fq" Apr 21 16:01:45.964797 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.964604 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jcctt\" (UniqueName: \"kubernetes.io/projected/59ea63ef-850a-4d2b-867e-080e5b551a72-kube-api-access-jcctt\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:45.964797 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.964630 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-run-ovn\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:45.964797 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.964654 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-host-cni-bin\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:45.964797 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.964678 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs864\" (UniqueName: \"kubernetes.io/projected/92ed2261-b49e-45eb-9081-57d883cfcf5a-kube-api-access-gs864\") pod \"network-check-target-8gq94\" (UID: \"92ed2261-b49e-45eb-9081-57d883cfcf5a\") " pod="openshift-network-diagnostics/network-check-target-8gq94" Apr 21 16:01:45.964797 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.964700 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/881adee0-d6c5-4372-babe-35f8b57591e6-registration-dir\") pod \"aws-ebs-csi-driver-node-zczqt\" (UID: \"881adee0-d6c5-4372-babe-35f8b57591e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt" Apr 21 16:01:45.964797 
ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.964724 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-host-var-lib-kubelet\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:45.964797 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.964747 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-cnibin\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:45.964797 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.964783 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-hostroot\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:45.965211 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.964807 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/06e52eb9-a2bb-4db2-8e4f-f435db21156c-ovnkube-config\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:45.965211 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.964835 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7640279c-4346-4f46-b222-3c04e5d7569e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8jszj\" (UID: 
\"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj" Apr 21 16:01:45.965211 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.964857 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-host\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:45.965211 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.964880 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgzlg\" (UniqueName: \"kubernetes.io/projected/1b6959b7-f233-41c1-ba9b-fd52091ee1be-kube-api-access-kgzlg\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:45.965211 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.964904 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qvvb\" (UniqueName: \"kubernetes.io/projected/d74dfe95-7664-4441-8273-cc22ee22d89f-kube-api-access-4qvvb\") pod \"node-ca-8flks\" (UID: \"d74dfe95-7664-4441-8273-cc22ee22d89f\") " pod="openshift-image-registry/node-ca-8flks" Apr 21 16:01:45.965211 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.964928 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-multus-socket-dir-parent\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:45.965211 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.964989 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/59ea63ef-850a-4d2b-867e-080e5b551a72-multus-daemon-config\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:45.965211 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965013 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7640279c-4346-4f46-b222-3c04e5d7569e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj" Apr 21 16:01:45.965211 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965040 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqngn\" (UniqueName: \"kubernetes.io/projected/881adee0-d6c5-4372-babe-35f8b57591e6-kube-api-access-cqngn\") pod \"aws-ebs-csi-driver-node-zczqt\" (UID: \"881adee0-d6c5-4372-babe-35f8b57591e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt" Apr 21 16:01:45.965211 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965063 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d74dfe95-7664-4441-8273-cc22ee22d89f-host\") pod \"node-ca-8flks\" (UID: \"d74dfe95-7664-4441-8273-cc22ee22d89f\") " pod="openshift-image-registry/node-ca-8flks" Apr 21 16:01:45.965211 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965095 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/06e52eb9-a2bb-4db2-8e4f-f435db21156c-ovn-node-metrics-cert\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:45.965211 ip-10-0-129-96 
kubenswrapper[2577]: I0421 16:01:45.965121 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-etc-systemd\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:45.965211 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965143 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/881adee0-d6c5-4372-babe-35f8b57591e6-device-dir\") pod \"aws-ebs-csi-driver-node-zczqt\" (UID: \"881adee0-d6c5-4372-babe-35f8b57591e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt" Apr 21 16:01:45.965664 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965242 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-host-var-lib-cni-bin\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:45.965664 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965275 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-multus-conf-dir\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:45.965664 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965304 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-host-run-multus-certs\") pod \"multus-4r2h9\" (UID: 
\"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:45.965664 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965333 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-etc-kubernetes\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:45.965664 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965376 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-host-slash\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:45.965664 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965424 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/06e52eb9-a2bb-4db2-8e4f-f435db21156c-env-overrides\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:45.965664 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965463 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/06e52eb9-a2bb-4db2-8e4f-f435db21156c-ovnkube-script-lib\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:45.965664 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965534 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/9501ad15-36cb-45df-9373-b96d6a0e43cf-host-slash\") pod \"iptables-alerter-cpgvc\" (UID: \"9501ad15-36cb-45df-9373-b96d6a0e43cf\") " pod="openshift-network-operator/iptables-alerter-cpgvc" Apr 21 16:01:45.965664 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965570 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-multus-cni-dir\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:45.965664 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965597 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-os-release\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:45.965664 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965620 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-host-kubelet\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:45.966153 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965682 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-node-log\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:45.966153 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965706 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-var-lib-kubelet\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:45.966153 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965726 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b6959b7-f233-41c1-ba9b-fd52091ee1be-tmp\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:45.966153 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965748 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/881adee0-d6c5-4372-babe-35f8b57591e6-socket-dir\") pod \"aws-ebs-csi-driver-node-zczqt\" (UID: \"881adee0-d6c5-4372-babe-35f8b57591e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt" Apr 21 16:01:45.966153 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965787 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-host-var-lib-cni-multus\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:45.966153 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965852 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-run-systemd\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:45.966153 ip-10-0-129-96 kubenswrapper[2577]: I0421 
16:01:45.965877 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-etc-openvswitch\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:45.966153 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965899 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-host-cni-netd\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:45.966153 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965922 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-etc-sysctl-conf\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:45.966153 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.965947 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-sys\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:45.966153 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966010 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/881adee0-d6c5-4372-babe-35f8b57591e6-etc-selinux\") pod \"aws-ebs-csi-driver-node-zczqt\" (UID: \"881adee0-d6c5-4372-babe-35f8b57591e6\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt" Apr 21 16:01:45.966153 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966034 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-system-cni-dir\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:45.966153 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966059 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-host-run-netns\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:45.966153 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966083 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7640279c-4346-4f46-b222-3c04e5d7569e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj" Apr 21 16:01:45.966153 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966114 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-etc-sysconfig\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:45.966853 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966174 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/881adee0-d6c5-4372-babe-35f8b57591e6-sys-fs\") pod \"aws-ebs-csi-driver-node-zczqt\" (UID: \"881adee0-d6c5-4372-babe-35f8b57591e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt" Apr 21 16:01:45.966853 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966212 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-host-run-ovn-kubernetes\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:45.966853 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966241 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9501ad15-36cb-45df-9373-b96d6a0e43cf-iptables-alerter-script\") pod \"iptables-alerter-cpgvc\" (UID: \"9501ad15-36cb-45df-9373-b96d6a0e43cf\") " pod="openshift-network-operator/iptables-alerter-cpgvc" Apr 21 16:01:45.966853 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966264 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-run-openvswitch\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:45.966853 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966339 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7640279c-4346-4f46-b222-3c04e5d7569e-cnibin\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj" Apr 21 16:01:45.966853 
ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966364 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs\") pod \"network-metrics-daemon-qhl4l\" (UID: \"4c538f88-ee19-454d-9d5b-09dd0a3ed71b\") " pod="openshift-multus/network-metrics-daemon-qhl4l" Apr 21 16:01:45.966853 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966388 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-host-run-netns\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:45.966853 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966415 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:45.966853 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966513 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ztbh\" (UniqueName: \"kubernetes.io/projected/06e52eb9-a2bb-4db2-8e4f-f435db21156c-kube-api-access-6ztbh\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:45.966853 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966580 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/59ea63ef-850a-4d2b-867e-080e5b551a72-cni-binary-copy\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:45.966853 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966638 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr945\" (UniqueName: \"kubernetes.io/projected/7640279c-4346-4f46-b222-3c04e5d7569e-kube-api-access-tr945\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj" Apr 21 16:01:45.966853 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966685 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-etc-kubernetes\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:45.966853 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966709 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-lib-modules\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:45.966853 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966738 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/881adee0-d6c5-4372-babe-35f8b57591e6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zczqt\" (UID: \"881adee0-d6c5-4372-babe-35f8b57591e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt" Apr 21 16:01:45.966853 ip-10-0-129-96 kubenswrapper[2577]: I0421 
16:01:45.966761 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-host-run-k8s-cni-cncf-io\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:45.966853 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966800 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-systemd-units\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:45.967560 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966827 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7640279c-4346-4f46-b222-3c04e5d7569e-system-cni-dir\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj" Apr 21 16:01:45.967560 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966860 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-etc-sysctl-d\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:45.967560 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966882 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-run\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " 
pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:45.967560 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966903 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d74dfe95-7664-4441-8273-cc22ee22d89f-serviceca\") pod \"node-ca-8flks\" (UID: \"d74dfe95-7664-4441-8273-cc22ee22d89f\") " pod="openshift-image-registry/node-ca-8flks" Apr 21 16:01:45.967560 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966925 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f334a381-fb5b-413d-b975-0e2511378c95-agent-certs\") pod \"konnectivity-agent-729fq\" (UID: \"f334a381-fb5b-413d-b975-0e2511378c95\") " pod="kube-system/konnectivity-agent-729fq" Apr 21 16:01:45.967560 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966955 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7640279c-4346-4f46-b222-3c04e5d7569e-os-release\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj" Apr 21 16:01:45.967560 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.966978 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1b6959b7-f233-41c1-ba9b-fd52091ee1be-etc-tuned\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:45.967560 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.967000 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5rr8\" (UniqueName: 
\"kubernetes.io/projected/9501ad15-36cb-45df-9373-b96d6a0e43cf-kube-api-access-l5rr8\") pod \"iptables-alerter-cpgvc\" (UID: \"9501ad15-36cb-45df-9373-b96d6a0e43cf\") " pod="openshift-network-operator/iptables-alerter-cpgvc" Apr 21 16:01:45.967560 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.967023 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-var-lib-openvswitch\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:45.967560 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.967061 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7640279c-4346-4f46-b222-3c04e5d7569e-cni-binary-copy\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj" Apr 21 16:01:45.967560 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.967084 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nxzw\" (UniqueName: \"kubernetes.io/projected/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-kube-api-access-7nxzw\") pod \"network-metrics-daemon-qhl4l\" (UID: \"4c538f88-ee19-454d-9d5b-09dd0a3ed71b\") " pod="openshift-multus/network-metrics-daemon-qhl4l" Apr 21 16:01:45.967560 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.967120 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-log-socket\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:45.967560 
ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:45.967155 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-etc-modprobe-d\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:46.006870 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.006828 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:56:44 +0000 UTC" deadline="2027-11-10 12:02:28.074128845 +0000 UTC" Apr 21 16:01:46.006870 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.006852 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13628h0m42.067279461s" Apr 21 16:01:46.052454 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.052429 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 16:01:46.054292 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.054235 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-96.ec2.internal" event={"ID":"ef7a67774dc3cec2ee7c5d8876aafbd3","Type":"ContainerStarted","Data":"a7c7e15643129061e1950a020d701794ff6fb3df61772a0d1b28069207fcb5a5"} Apr 21 16:01:46.055325 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.055297 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal" event={"ID":"0544071a2bc7a957493cde1dbb18d0ee","Type":"ContainerStarted","Data":"e3eb962cba3d916769be3e0cef1b7b51e29b3b65e213cc6649fa5f2a03fe9863"} Apr 21 16:01:46.067552 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.067527 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/06e52eb9-a2bb-4db2-8e4f-f435db21156c-ovnkube-script-lib\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:46.067672 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.067563 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9501ad15-36cb-45df-9373-b96d6a0e43cf-host-slash\") pod \"iptables-alerter-cpgvc\" (UID: \"9501ad15-36cb-45df-9373-b96d6a0e43cf\") " pod="openshift-network-operator/iptables-alerter-cpgvc" Apr 21 16:01:46.067672 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.067590 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-multus-cni-dir\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:46.067672 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.067609 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-os-release\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:46.067672 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.067633 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-host-kubelet\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:46.067672 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.067657 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-node-log\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:46.068001 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.067679 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-var-lib-kubelet\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:46.068001 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.067701 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b6959b7-f233-41c1-ba9b-fd52091ee1be-tmp\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:46.068001 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.067725 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/881adee0-d6c5-4372-babe-35f8b57591e6-socket-dir\") pod \"aws-ebs-csi-driver-node-zczqt\" (UID: \"881adee0-d6c5-4372-babe-35f8b57591e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt" Apr 21 16:01:46.068001 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.067755 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-host-var-lib-cni-multus\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:46.068001 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.067794 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-run-systemd\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:46.068001 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.067818 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-etc-openvswitch\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:46.068001 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.067839 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-host-cni-netd\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:46.068001 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.067879 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-etc-sysctl-conf\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:46.068001 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.067900 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-sys\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:46.068001 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.067915 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/881adee0-d6c5-4372-babe-35f8b57591e6-etc-selinux\") pod \"aws-ebs-csi-driver-node-zczqt\" (UID: \"881adee0-d6c5-4372-babe-35f8b57591e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt" Apr 21 16:01:46.068001 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.067934 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4701d99b-4f05-4410-bf15-f5fd3e3bf5bf-hosts-file\") pod \"node-resolver-86kzc\" (UID: \"4701d99b-4f05-4410-bf15-f5fd3e3bf5bf\") " pod="openshift-dns/node-resolver-86kzc" Apr 21 16:01:46.068001 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.067950 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-system-cni-dir\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:46.068001 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.067966 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-host-run-netns\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:46.068001 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.067982 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7640279c-4346-4f46-b222-3c04e5d7569e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj" Apr 21 16:01:46.068001 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.067998 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-etc-sysconfig\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7"
Apr 21 16:01:46.068712 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068014 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/881adee0-d6c5-4372-babe-35f8b57591e6-sys-fs\") pod \"aws-ebs-csi-driver-node-zczqt\" (UID: \"881adee0-d6c5-4372-babe-35f8b57591e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt"
Apr 21 16:01:46.068712 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068030 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk6dr\" (UniqueName: \"kubernetes.io/projected/4701d99b-4f05-4410-bf15-f5fd3e3bf5bf-kube-api-access-gk6dr\") pod \"node-resolver-86kzc\" (UID: \"4701d99b-4f05-4410-bf15-f5fd3e3bf5bf\") " pod="openshift-dns/node-resolver-86kzc"
Apr 21 16:01:46.068712 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068046 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-host-run-ovn-kubernetes\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.068712 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068061 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9501ad15-36cb-45df-9373-b96d6a0e43cf-iptables-alerter-script\") pod \"iptables-alerter-cpgvc\" (UID: \"9501ad15-36cb-45df-9373-b96d6a0e43cf\") " pod="openshift-network-operator/iptables-alerter-cpgvc"
Apr 21 16:01:46.068712 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068079 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-run-openvswitch\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.068712 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068094 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7640279c-4346-4f46-b222-3c04e5d7569e-cnibin\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj"
Apr 21 16:01:46.068712 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068109 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs\") pod \"network-metrics-daemon-qhl4l\" (UID: \"4c538f88-ee19-454d-9d5b-09dd0a3ed71b\") " pod="openshift-multus/network-metrics-daemon-qhl4l"
Apr 21 16:01:46.068712 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068124 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-host-run-netns\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.068712 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068139 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.068712 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068172 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ztbh\" (UniqueName: \"kubernetes.io/projected/06e52eb9-a2bb-4db2-8e4f-f435db21156c-kube-api-access-6ztbh\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.068712 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068193 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/59ea63ef-850a-4d2b-867e-080e5b551a72-cni-binary-copy\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9"
Apr 21 16:01:46.068712 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068214 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tr945\" (UniqueName: \"kubernetes.io/projected/7640279c-4346-4f46-b222-3c04e5d7569e-kube-api-access-tr945\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj"
Apr 21 16:01:46.068712 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068229 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-etc-kubernetes\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7"
Apr 21 16:01:46.068712 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068228 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/06e52eb9-a2bb-4db2-8e4f-f435db21156c-ovnkube-script-lib\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.068712 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068257 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-lib-modules\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7"
Apr 21 16:01:46.068712 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068282 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/881adee0-d6c5-4372-babe-35f8b57591e6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zczqt\" (UID: \"881adee0-d6c5-4372-babe-35f8b57591e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt"
Apr 21 16:01:46.068712 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068315 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-host-run-k8s-cni-cncf-io\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9"
Apr 21 16:01:46.069520 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068339 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-systemd-units\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.069520 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068364 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7640279c-4346-4f46-b222-3c04e5d7569e-system-cni-dir\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj"
Apr 21 16:01:46.069520 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068388 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-etc-sysctl-d\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7"
Apr 21 16:01:46.069520 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068410 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-run\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7"
Apr 21 16:01:46.069520 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068432 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d74dfe95-7664-4441-8273-cc22ee22d89f-serviceca\") pod \"node-ca-8flks\" (UID: \"d74dfe95-7664-4441-8273-cc22ee22d89f\") " pod="openshift-image-registry/node-ca-8flks"
Apr 21 16:01:46.069520 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068456 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f334a381-fb5b-413d-b975-0e2511378c95-agent-certs\") pod \"konnectivity-agent-729fq\" (UID: \"f334a381-fb5b-413d-b975-0e2511378c95\") " pod="kube-system/konnectivity-agent-729fq"
Apr 21 16:01:46.069520 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068503 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7640279c-4346-4f46-b222-3c04e5d7569e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj"
Apr 21 16:01:46.069520 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068551 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9501ad15-36cb-45df-9373-b96d6a0e43cf-host-slash\") pod \"iptables-alerter-cpgvc\" (UID: \"9501ad15-36cb-45df-9373-b96d6a0e43cf\") " pod="openshift-network-operator/iptables-alerter-cpgvc"
Apr 21 16:01:46.069520 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068616 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-var-lib-kubelet\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7"
Apr 21 16:01:46.069520 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068620 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-run-systemd\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.069520 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068644 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-systemd-units\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.069520 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068729 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-etc-openvswitch\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.069520 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068753 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 21 16:01:46.069520 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068807 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7640279c-4346-4f46-b222-3c04e5d7569e-os-release\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj"
Apr 21 16:01:46.069520 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068810 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-host-cni-netd\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.069520 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068810 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-multus-cni-dir\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9"
Apr 21 16:01:46.069520 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068854 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-node-log\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.069520 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068877 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-os-release\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9"
Apr 21 16:01:46.070264 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068940 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-etc-sysctl-d\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7"
Apr 21 16:01:46.070264 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069287 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-etc-sysctl-conf\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7"
Apr 21 16:01:46.070264 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069345 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-sys\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7"
Apr 21 16:01:46.070264 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.068906 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-host-run-k8s-cni-cncf-io\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9"
Apr 21 16:01:46.070264 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069441 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-system-cni-dir\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9"
Apr 21 16:01:46.070264 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069462 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-host-kubelet\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.070264 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069478 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-host-run-ovn-kubernetes\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.070264 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069522 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-run\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7"
Apr 21 16:01:46.070264 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069555 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-host-var-lib-cni-multus\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9"
Apr 21 16:01:46.070264 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069567 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-lib-modules\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7"
Apr 21 16:01:46.070264 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069606 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7640279c-4346-4f46-b222-3c04e5d7569e-system-cni-dir\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj"
Apr 21 16:01:46.070264 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069639 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-host-run-netns\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9"
Apr 21 16:01:46.070264 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069643 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/881adee0-d6c5-4372-babe-35f8b57591e6-socket-dir\") pod \"aws-ebs-csi-driver-node-zczqt\" (UID: \"881adee0-d6c5-4372-babe-35f8b57591e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt"
Apr 21 16:01:46.070264 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069443 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/881adee0-d6c5-4372-babe-35f8b57591e6-etc-selinux\") pod \"aws-ebs-csi-driver-node-zczqt\" (UID: \"881adee0-d6c5-4372-babe-35f8b57591e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt"
Apr 21 16:01:46.070264 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069682 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/881adee0-d6c5-4372-babe-35f8b57591e6-sys-fs\") pod \"aws-ebs-csi-driver-node-zczqt\" (UID: \"881adee0-d6c5-4372-babe-35f8b57591e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt"
Apr 21 16:01:46.070264 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069682 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7640279c-4346-4f46-b222-3c04e5d7569e-cnibin\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj"
Apr 21 16:01:46.070264 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069702 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1b6959b7-f233-41c1-ba9b-fd52091ee1be-etc-tuned\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7"
Apr 21 16:01:46.070264 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069722 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-etc-kubernetes\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7"
Apr 21 16:01:46.071133 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069730 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5rr8\" (UniqueName: \"kubernetes.io/projected/9501ad15-36cb-45df-9373-b96d6a0e43cf-kube-api-access-l5rr8\") pod \"iptables-alerter-cpgvc\" (UID: \"9501ad15-36cb-45df-9373-b96d6a0e43cf\") " pod="openshift-network-operator/iptables-alerter-cpgvc"
Apr 21 16:01:46.071133 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069734 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-run-openvswitch\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.071133 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069739 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/881adee0-d6c5-4372-babe-35f8b57591e6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zczqt\" (UID: \"881adee0-d6c5-4372-babe-35f8b57591e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt"
Apr 21 16:01:46.071133 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069758 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-var-lib-openvswitch\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.071133 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069762 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.071133 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069805 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7640279c-4346-4f46-b222-3c04e5d7569e-os-release\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj"
Apr 21 16:01:46.071133 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069818 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-host-run-netns\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.071133 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069833 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7640279c-4346-4f46-b222-3c04e5d7569e-cni-binary-copy\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj"
Apr 21 16:01:46.071133 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069851 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-etc-sysconfig\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7"
Apr 21 16:01:46.071133 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069862 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nxzw\" (UniqueName: \"kubernetes.io/projected/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-kube-api-access-7nxzw\") pod \"network-metrics-daemon-qhl4l\" (UID: \"4c538f88-ee19-454d-9d5b-09dd0a3ed71b\") " pod="openshift-multus/network-metrics-daemon-qhl4l"
Apr 21 16:01:46.071133 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:46.069895 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 16:01:46.071133 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069916 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-log-socket\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.071133 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069941 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-etc-modprobe-d\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7"
Apr 21 16:01:46.071133 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:46.069967 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs podName:4c538f88-ee19-454d-9d5b-09dd0a3ed71b nodeName:}" failed. No retries permitted until 2026-04-21 16:01:46.56994099 +0000 UTC m=+3.176843093 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs") pod "network-metrics-daemon-qhl4l" (UID: "4c538f88-ee19-454d-9d5b-09dd0a3ed71b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 16:01:46.071133 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069978 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9501ad15-36cb-45df-9373-b96d6a0e43cf-iptables-alerter-script\") pod \"iptables-alerter-cpgvc\" (UID: \"9501ad15-36cb-45df-9373-b96d6a0e43cf\") " pod="openshift-network-operator/iptables-alerter-cpgvc"
Apr 21 16:01:46.071133 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.069999 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f334a381-fb5b-413d-b975-0e2511378c95-konnectivity-ca\") pod \"konnectivity-agent-729fq\" (UID: \"f334a381-fb5b-413d-b975-0e2511378c95\") " pod="kube-system/konnectivity-agent-729fq"
Apr 21 16:01:46.071133 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070031 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-etc-modprobe-d\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7"
Apr 21 16:01:46.071972 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070048 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-var-lib-openvswitch\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.071972 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070190 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-log-socket\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.071972 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070228 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jcctt\" (UniqueName: \"kubernetes.io/projected/59ea63ef-850a-4d2b-867e-080e5b551a72-kube-api-access-jcctt\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9"
Apr 21 16:01:46.071972 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070322 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/59ea63ef-850a-4d2b-867e-080e5b551a72-cni-binary-copy\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9"
Apr 21 16:01:46.071972 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070373 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-run-ovn\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.071972 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070401 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-host-cni-bin\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.071972 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070441 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs864\" (UniqueName: \"kubernetes.io/projected/92ed2261-b49e-45eb-9081-57d883cfcf5a-kube-api-access-gs864\") pod \"network-check-target-8gq94\" (UID: \"92ed2261-b49e-45eb-9081-57d883cfcf5a\") " pod="openshift-network-diagnostics/network-check-target-8gq94"
Apr 21 16:01:46.071972 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070467 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/881adee0-d6c5-4372-babe-35f8b57591e6-registration-dir\") pod \"aws-ebs-csi-driver-node-zczqt\" (UID: \"881adee0-d6c5-4372-babe-35f8b57591e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt"
Apr 21 16:01:46.071972 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070494 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-host-var-lib-kubelet\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9"
Apr 21 16:01:46.071972 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070536 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-cnibin\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9"
Apr 21 16:01:46.071972 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070584 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-hostroot\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9"
Apr 21 16:01:46.071972 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070612 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/06e52eb9-a2bb-4db2-8e4f-f435db21156c-ovnkube-config\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.071972 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7640279c-4346-4f46-b222-3c04e5d7569e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj"
Apr 21 16:01:46.071972 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070667 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-host\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7"
Apr 21 16:01:46.071972 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070681 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7640279c-4346-4f46-b222-3c04e5d7569e-cni-binary-copy\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj"
Apr 21 16:01:46.071972 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070700 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgzlg\" (UniqueName: \"kubernetes.io/projected/1b6959b7-f233-41c1-ba9b-fd52091ee1be-kube-api-access-kgzlg\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7"
Apr 21 16:01:46.071972 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070731 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qvvb\" (UniqueName: \"kubernetes.io/projected/d74dfe95-7664-4441-8273-cc22ee22d89f-kube-api-access-4qvvb\") pod \"node-ca-8flks\" (UID: \"d74dfe95-7664-4441-8273-cc22ee22d89f\") " pod="openshift-image-registry/node-ca-8flks"
Apr 21 16:01:46.072847 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070824 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-multus-socket-dir-parent\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9"
Apr 21 16:01:46.072847 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070851 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/59ea63ef-850a-4d2b-867e-080e5b551a72-multus-daemon-config\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9"
Apr 21 16:01:46.072847 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070869 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-hostroot\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9"
Apr 21 16:01:46.072847 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070880 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7640279c-4346-4f46-b222-3c04e5d7569e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj"
Apr 21 16:01:46.072847 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070912 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqngn\" (UniqueName: \"kubernetes.io/projected/881adee0-d6c5-4372-babe-35f8b57591e6-kube-api-access-cqngn\") pod \"aws-ebs-csi-driver-node-zczqt\" (UID: \"881adee0-d6c5-4372-babe-35f8b57591e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt"
Apr 21 16:01:46.072847 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070933 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-cnibin\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9"
Apr 21 16:01:46.072847 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070945 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-run-ovn\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.072847 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070934 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-host\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7"
Apr 21 16:01:46.072847 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070985 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/881adee0-d6c5-4372-babe-35f8b57591e6-registration-dir\") pod \"aws-ebs-csi-driver-node-zczqt\" (UID: \"881adee0-d6c5-4372-babe-35f8b57591e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt"
Apr 21 16:01:46.073998 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.070992 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-multus-socket-dir-parent\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9"
Apr 21 16:01:46.074096 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.071024 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d74dfe95-7664-4441-8273-cc22ee22d89f-host\") pod \"node-ca-8flks\" (UID: \"d74dfe95-7664-4441-8273-cc22ee22d89f\") " pod="openshift-image-registry/node-ca-8flks"
Apr 21 16:01:46.074096 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.071179 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d74dfe95-7664-4441-8273-cc22ee22d89f-host\") pod \"node-ca-8flks\" (UID: \"d74dfe95-7664-4441-8273-cc22ee22d89f\") " pod="openshift-image-registry/node-ca-8flks"
Apr 21 16:01:46.074096 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.073959 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d74dfe95-7664-4441-8273-cc22ee22d89f-serviceca\") pod \"node-ca-8flks\" (UID: \"d74dfe95-7664-4441-8273-cc22ee22d89f\") " pod="openshift-image-registry/node-ca-8flks"
Apr 21 16:01:46.074247 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.074149 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/06e52eb9-a2bb-4db2-8e4f-f435db21156c-ovnkube-config\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:01:46.074389 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.074360 2577 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f334a381-fb5b-413d-b975-0e2511378c95-konnectivity-ca\") pod \"konnectivity-agent-729fq\" (UID: \"f334a381-fb5b-413d-b975-0e2511378c95\") " pod="kube-system/konnectivity-agent-729fq" Apr 21 16:01:46.074449 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.074399 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7640279c-4346-4f46-b222-3c04e5d7569e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj" Apr 21 16:01:46.074606 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.074587 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1b6959b7-f233-41c1-ba9b-fd52091ee1be-etc-tuned\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:46.076068 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.074964 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/59ea63ef-850a-4d2b-867e-080e5b551a72-multus-daemon-config\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:46.076068 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.074969 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4701d99b-4f05-4410-bf15-f5fd3e3bf5bf-tmp-dir\") pod \"node-resolver-86kzc\" (UID: \"4701d99b-4f05-4410-bf15-f5fd3e3bf5bf\") " pod="openshift-dns/node-resolver-86kzc" Apr 21 16:01:46.076068 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.075033 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-host-var-lib-kubelet\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:46.076068 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.075043 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/06e52eb9-a2bb-4db2-8e4f-f435db21156c-ovn-node-metrics-cert\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:46.076068 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.075071 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b6959b7-f233-41c1-ba9b-fd52091ee1be-tmp\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:46.076068 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.075081 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-etc-systemd\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:46.076068 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.075114 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/881adee0-d6c5-4372-babe-35f8b57591e6-device-dir\") pod \"aws-ebs-csi-driver-node-zczqt\" (UID: \"881adee0-d6c5-4372-babe-35f8b57591e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt" Apr 21 16:01:46.076068 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.075153 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-host-var-lib-cni-bin\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:46.076068 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.075217 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-multus-conf-dir\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:46.076068 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.075258 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-host-run-multus-certs\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:46.076068 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.075259 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f334a381-fb5b-413d-b975-0e2511378c95-agent-certs\") pod \"konnectivity-agent-729fq\" (UID: \"f334a381-fb5b-413d-b975-0e2511378c95\") " pod="kube-system/konnectivity-agent-729fq" Apr 21 16:01:46.076068 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.075330 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-etc-kubernetes\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:46.076068 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.075326 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-host-run-multus-certs\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:46.076068 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.075368 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-host-slash\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:46.076068 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.075441 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1b6959b7-f233-41c1-ba9b-fd52091ee1be-etc-systemd\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:46.076068 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.075471 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/06e52eb9-a2bb-4db2-8e4f-f435db21156c-env-overrides\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:46.076068 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.075518 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/881adee0-d6c5-4372-babe-35f8b57591e6-device-dir\") pod \"aws-ebs-csi-driver-node-zczqt\" (UID: \"881adee0-d6c5-4372-babe-35f8b57591e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt" Apr 21 16:01:46.076068 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.075683 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-etc-kubernetes\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:46.076912 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.075735 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-host-slash\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:46.076912 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.075930 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/06e52eb9-a2bb-4db2-8e4f-f435db21156c-env-overrides\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:46.076912 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.075964 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 16:01:46.076912 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.075989 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7640279c-4346-4f46-b222-3c04e5d7569e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj" Apr 21 16:01:46.076912 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.075997 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-multus-conf-dir\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " 
pod="openshift-multus/multus-4r2h9" Apr 21 16:01:46.076912 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.076032 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/59ea63ef-850a-4d2b-867e-080e5b551a72-host-var-lib-cni-bin\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:46.076912 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.076137 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/06e52eb9-a2bb-4db2-8e4f-f435db21156c-host-cni-bin\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:46.080299 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.080277 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/06e52eb9-a2bb-4db2-8e4f-f435db21156c-ovn-node-metrics-cert\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:46.080736 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.080712 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nxzw\" (UniqueName: \"kubernetes.io/projected/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-kube-api-access-7nxzw\") pod \"network-metrics-daemon-qhl4l\" (UID: \"4c538f88-ee19-454d-9d5b-09dd0a3ed71b\") " pod="openshift-multus/network-metrics-daemon-qhl4l" Apr 21 16:01:46.082021 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:46.081999 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 16:01:46.082117 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:46.082029 2577 projected.go:289] Couldn't 
get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 16:01:46.082117 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:46.082043 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gs864 for pod openshift-network-diagnostics/network-check-target-8gq94: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:01:46.082117 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:46.082104 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92ed2261-b49e-45eb-9081-57d883cfcf5a-kube-api-access-gs864 podName:92ed2261-b49e-45eb-9081-57d883cfcf5a nodeName:}" failed. No retries permitted until 2026-04-21 16:01:46.582087898 +0000 UTC m=+3.188990014 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-gs864" (UniqueName: "kubernetes.io/projected/92ed2261-b49e-45eb-9081-57d883cfcf5a-kube-api-access-gs864") pod "network-check-target-8gq94" (UID: "92ed2261-b49e-45eb-9081-57d883cfcf5a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:01:46.082642 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.082455 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5rr8\" (UniqueName: \"kubernetes.io/projected/9501ad15-36cb-45df-9373-b96d6a0e43cf-kube-api-access-l5rr8\") pod \"iptables-alerter-cpgvc\" (UID: \"9501ad15-36cb-45df-9373-b96d6a0e43cf\") " pod="openshift-network-operator/iptables-alerter-cpgvc" Apr 21 16:01:46.083467 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.083422 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr945\" (UniqueName: 
\"kubernetes.io/projected/7640279c-4346-4f46-b222-3c04e5d7569e-kube-api-access-tr945\") pod \"multus-additional-cni-plugins-8jszj\" (UID: \"7640279c-4346-4f46-b222-3c04e5d7569e\") " pod="openshift-multus/multus-additional-cni-plugins-8jszj" Apr 21 16:01:46.083644 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.083621 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ztbh\" (UniqueName: \"kubernetes.io/projected/06e52eb9-a2bb-4db2-8e4f-f435db21156c-kube-api-access-6ztbh\") pod \"ovnkube-node-6rkhl\" (UID: \"06e52eb9-a2bb-4db2-8e4f-f435db21156c\") " pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:46.085367 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.085346 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcctt\" (UniqueName: \"kubernetes.io/projected/59ea63ef-850a-4d2b-867e-080e5b551a72-kube-api-access-jcctt\") pod \"multus-4r2h9\" (UID: \"59ea63ef-850a-4d2b-867e-080e5b551a72\") " pod="openshift-multus/multus-4r2h9" Apr 21 16:01:46.085506 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.085363 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgzlg\" (UniqueName: \"kubernetes.io/projected/1b6959b7-f233-41c1-ba9b-fd52091ee1be-kube-api-access-kgzlg\") pod \"tuned-r7qx7\" (UID: \"1b6959b7-f233-41c1-ba9b-fd52091ee1be\") " pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:46.085914 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.085878 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qvvb\" (UniqueName: \"kubernetes.io/projected/d74dfe95-7664-4441-8273-cc22ee22d89f-kube-api-access-4qvvb\") pod \"node-ca-8flks\" (UID: \"d74dfe95-7664-4441-8273-cc22ee22d89f\") " pod="openshift-image-registry/node-ca-8flks" Apr 21 16:01:46.086497 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.086460 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cqngn\" (UniqueName: \"kubernetes.io/projected/881adee0-d6c5-4372-babe-35f8b57591e6-kube-api-access-cqngn\") pod \"aws-ebs-csi-driver-node-zczqt\" (UID: \"881adee0-d6c5-4372-babe-35f8b57591e6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt" Apr 21 16:01:46.176784 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.176694 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4701d99b-4f05-4410-bf15-f5fd3e3bf5bf-hosts-file\") pod \"node-resolver-86kzc\" (UID: \"4701d99b-4f05-4410-bf15-f5fd3e3bf5bf\") " pod="openshift-dns/node-resolver-86kzc" Apr 21 16:01:46.176784 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.176738 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gk6dr\" (UniqueName: \"kubernetes.io/projected/4701d99b-4f05-4410-bf15-f5fd3e3bf5bf-kube-api-access-gk6dr\") pod \"node-resolver-86kzc\" (UID: \"4701d99b-4f05-4410-bf15-f5fd3e3bf5bf\") " pod="openshift-dns/node-resolver-86kzc" Apr 21 16:01:46.176979 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.176821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4701d99b-4f05-4410-bf15-f5fd3e3bf5bf-tmp-dir\") pod \"node-resolver-86kzc\" (UID: \"4701d99b-4f05-4410-bf15-f5fd3e3bf5bf\") " pod="openshift-dns/node-resolver-86kzc" Apr 21 16:01:46.176979 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.176834 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4701d99b-4f05-4410-bf15-f5fd3e3bf5bf-hosts-file\") pod \"node-resolver-86kzc\" (UID: \"4701d99b-4f05-4410-bf15-f5fd3e3bf5bf\") " pod="openshift-dns/node-resolver-86kzc" Apr 21 16:01:46.177168 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.177140 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" 
(UniqueName: \"kubernetes.io/empty-dir/4701d99b-4f05-4410-bf15-f5fd3e3bf5bf-tmp-dir\") pod \"node-resolver-86kzc\" (UID: \"4701d99b-4f05-4410-bf15-f5fd3e3bf5bf\") " pod="openshift-dns/node-resolver-86kzc" Apr 21 16:01:46.185307 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.185280 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk6dr\" (UniqueName: \"kubernetes.io/projected/4701d99b-4f05-4410-bf15-f5fd3e3bf5bf-kube-api-access-gk6dr\") pod \"node-resolver-86kzc\" (UID: \"4701d99b-4f05-4410-bf15-f5fd3e3bf5bf\") " pod="openshift-dns/node-resolver-86kzc" Apr 21 16:01:46.254330 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.254295 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4r2h9" Apr 21 16:01:46.264087 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.264064 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-cpgvc" Apr 21 16:01:46.273807 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.273770 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:01:46.280584 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.280563 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt" Apr 21 16:01:46.287284 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.287263 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8flks" Apr 21 16:01:46.294880 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.294860 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8jszj" Apr 21 16:01:46.302495 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.302468 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-729fq" Apr 21 16:01:46.314162 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.314136 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" Apr 21 16:01:46.321751 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.321729 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-86kzc" Apr 21 16:01:46.579812 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.579706 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs\") pod \"network-metrics-daemon-qhl4l\" (UID: \"4c538f88-ee19-454d-9d5b-09dd0a3ed71b\") " pod="openshift-multus/network-metrics-daemon-qhl4l" Apr 21 16:01:46.579957 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:46.579908 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:01:46.580030 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:46.579988 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs podName:4c538f88-ee19-454d-9d5b-09dd0a3ed71b nodeName:}" failed. No retries permitted until 2026-04-21 16:01:47.57996864 +0000 UTC m=+4.186870743 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs") pod "network-metrics-daemon-qhl4l" (UID: "4c538f88-ee19-454d-9d5b-09dd0a3ed71b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:01:46.680307 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:46.680271 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs864\" (UniqueName: \"kubernetes.io/projected/92ed2261-b49e-45eb-9081-57d883cfcf5a-kube-api-access-gs864\") pod \"network-check-target-8gq94\" (UID: \"92ed2261-b49e-45eb-9081-57d883cfcf5a\") " pod="openshift-network-diagnostics/network-check-target-8gq94" Apr 21 16:01:46.680476 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:46.680429 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 16:01:46.680476 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:46.680447 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 16:01:46.680476 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:46.680458 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gs864 for pod openshift-network-diagnostics/network-check-target-8gq94: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:01:46.680637 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:46.680514 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92ed2261-b49e-45eb-9081-57d883cfcf5a-kube-api-access-gs864 podName:92ed2261-b49e-45eb-9081-57d883cfcf5a nodeName:}" failed. 
No retries permitted until 2026-04-21 16:01:47.680495298 +0000 UTC m=+4.287397396 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-gs864" (UniqueName: "kubernetes.io/projected/92ed2261-b49e-45eb-9081-57d883cfcf5a-kube-api-access-gs864") pod "network-check-target-8gq94" (UID: "92ed2261-b49e-45eb-9081-57d883cfcf5a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:01:46.870886 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:46.870862 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06e52eb9_a2bb_4db2_8e4f_f435db21156c.slice/crio-d093cf389c9a3c82c6a2285bc2b7419595bab1a8ffcc50efe1044cc2d1a7b5df WatchSource:0}: Error finding container d093cf389c9a3c82c6a2285bc2b7419595bab1a8ffcc50efe1044cc2d1a7b5df: Status 404 returned error can't find the container with id d093cf389c9a3c82c6a2285bc2b7419595bab1a8ffcc50efe1044cc2d1a7b5df Apr 21 16:01:46.873164 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:46.873131 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b6959b7_f233_41c1_ba9b_fd52091ee1be.slice/crio-68cdf0417dd7a46e67d23c3823861266f452dce87907199b876c237d594d5ddb WatchSource:0}: Error finding container 68cdf0417dd7a46e67d23c3823861266f452dce87907199b876c237d594d5ddb: Status 404 returned error can't find the container with id 68cdf0417dd7a46e67d23c3823861266f452dce87907199b876c237d594d5ddb Apr 21 16:01:46.876403 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:46.876375 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9501ad15_36cb_45df_9373_b96d6a0e43cf.slice/crio-8ebe57a7c1df64806b970b30a53ea5f231beeb9e2c5d6a4d2b9af7fc3fe9910f WatchSource:0}: Error finding container 
8ebe57a7c1df64806b970b30a53ea5f231beeb9e2c5d6a4d2b9af7fc3fe9910f: Status 404 returned error can't find the container with id 8ebe57a7c1df64806b970b30a53ea5f231beeb9e2c5d6a4d2b9af7fc3fe9910f
Apr 21 16:01:46.877254 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:46.877233 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf334a381_fb5b_413d_b975_0e2511378c95.slice/crio-18df24f71758dfabfaad668304f4e12950667bd9734092ef0446aa892082878b WatchSource:0}: Error finding container 18df24f71758dfabfaad668304f4e12950667bd9734092ef0446aa892082878b: Status 404 returned error can't find the container with id 18df24f71758dfabfaad668304f4e12950667bd9734092ef0446aa892082878b
Apr 21 16:01:46.878056 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:46.878037 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4701d99b_4f05_4410_bf15_f5fd3e3bf5bf.slice/crio-e30fe40f88961da64d20a8f20ea42c4fa4f1f7769aeaf2cc5a7f87c126d6650b WatchSource:0}: Error finding container e30fe40f88961da64d20a8f20ea42c4fa4f1f7769aeaf2cc5a7f87c126d6650b: Status 404 returned error can't find the container with id e30fe40f88961da64d20a8f20ea42c4fa4f1f7769aeaf2cc5a7f87c126d6650b
Apr 21 16:01:46.880263 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:46.880241 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod881adee0_d6c5_4372_babe_35f8b57591e6.slice/crio-5894c1549147f0c88a5c6103d677f1c85dd65747c5f15e3a09c3f6ef1f515811 WatchSource:0}: Error finding container 5894c1549147f0c88a5c6103d677f1c85dd65747c5f15e3a09c3f6ef1f515811: Status 404 returned error can't find the container with id 5894c1549147f0c88a5c6103d677f1c85dd65747c5f15e3a09c3f6ef1f515811
Apr 21 16:01:46.881340 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:46.881206 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd74dfe95_7664_4441_8273_cc22ee22d89f.slice/crio-1aeaee6e5e62cac1157e5f275573f386582cce8f4dbdb31b81cbb4bcde691f03 WatchSource:0}: Error finding container 1aeaee6e5e62cac1157e5f275573f386582cce8f4dbdb31b81cbb4bcde691f03: Status 404 returned error can't find the container with id 1aeaee6e5e62cac1157e5f275573f386582cce8f4dbdb31b81cbb4bcde691f03
Apr 21 16:01:46.882136 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:46.881894 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59ea63ef_850a_4d2b_867e_080e5b551a72.slice/crio-7f493c332b921cc5e50f8408e324c76ffc5a88a2652bd6d950c7289078fc76b0 WatchSource:0}: Error finding container 7f493c332b921cc5e50f8408e324c76ffc5a88a2652bd6d950c7289078fc76b0: Status 404 returned error can't find the container with id 7f493c332b921cc5e50f8408e324c76ffc5a88a2652bd6d950c7289078fc76b0
Apr 21 16:01:46.884561 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:01:46.883816 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7640279c_4346_4f46_b222_3c04e5d7569e.slice/crio-1f7cf6ba0455cc0520a072e51905a430bc3304b5434e13468e0e77d494fcdc55 WatchSource:0}: Error finding container 1f7cf6ba0455cc0520a072e51905a430bc3304b5434e13468e0e77d494fcdc55: Status 404 returned error can't find the container with id 1f7cf6ba0455cc0520a072e51905a430bc3304b5434e13468e0e77d494fcdc55
Apr 21 16:01:47.007324 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:47.007150 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 15:56:44 +0000 UTC" deadline="2027-12-21 22:34:13.139352835 +0000 UTC"
Apr 21 16:01:47.007324 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:47.007320 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14622h32m26.132037071s"
Apr 21 16:01:47.057616 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:47.057579 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8jszj" event={"ID":"7640279c-4346-4f46-b222-3c04e5d7569e","Type":"ContainerStarted","Data":"1f7cf6ba0455cc0520a072e51905a430bc3304b5434e13468e0e77d494fcdc55"}
Apr 21 16:01:47.058488 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:47.058468 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4r2h9" event={"ID":"59ea63ef-850a-4d2b-867e-080e5b551a72","Type":"ContainerStarted","Data":"7f493c332b921cc5e50f8408e324c76ffc5a88a2652bd6d950c7289078fc76b0"}
Apr 21 16:01:47.059338 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:47.059320 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt" event={"ID":"881adee0-d6c5-4372-babe-35f8b57591e6","Type":"ContainerStarted","Data":"5894c1549147f0c88a5c6103d677f1c85dd65747c5f15e3a09c3f6ef1f515811"}
Apr 21 16:01:47.060172 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:47.060154 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-86kzc" event={"ID":"4701d99b-4f05-4410-bf15-f5fd3e3bf5bf","Type":"ContainerStarted","Data":"e30fe40f88961da64d20a8f20ea42c4fa4f1f7769aeaf2cc5a7f87c126d6650b"}
Apr 21 16:01:47.061171 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:47.061152 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-729fq" event={"ID":"f334a381-fb5b-413d-b975-0e2511378c95","Type":"ContainerStarted","Data":"18df24f71758dfabfaad668304f4e12950667bd9734092ef0446aa892082878b"}
Apr 21 16:01:47.062042 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:47.062025 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" event={"ID":"1b6959b7-f233-41c1-ba9b-fd52091ee1be","Type":"ContainerStarted","Data":"68cdf0417dd7a46e67d23c3823861266f452dce87907199b876c237d594d5ddb"}
Apr 21 16:01:47.062983 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:47.062960 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" event={"ID":"06e52eb9-a2bb-4db2-8e4f-f435db21156c","Type":"ContainerStarted","Data":"d093cf389c9a3c82c6a2285bc2b7419595bab1a8ffcc50efe1044cc2d1a7b5df"}
Apr 21 16:01:47.064424 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:47.064400 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-96.ec2.internal" event={"ID":"ef7a67774dc3cec2ee7c5d8876aafbd3","Type":"ContainerStarted","Data":"5cedff4f48c4fa680042c81fef7c0297f55f556b61c943103480e17bb412f5fc"}
Apr 21 16:01:47.065379 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:47.065355 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8flks" event={"ID":"d74dfe95-7664-4441-8273-cc22ee22d89f","Type":"ContainerStarted","Data":"1aeaee6e5e62cac1157e5f275573f386582cce8f4dbdb31b81cbb4bcde691f03"}
Apr 21 16:01:47.066314 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:47.066297 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cpgvc" event={"ID":"9501ad15-36cb-45df-9373-b96d6a0e43cf","Type":"ContainerStarted","Data":"8ebe57a7c1df64806b970b30a53ea5f231beeb9e2c5d6a4d2b9af7fc3fe9910f"}
Apr 21 16:01:47.586656 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:47.586022 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs\") pod \"network-metrics-daemon-qhl4l\" (UID: \"4c538f88-ee19-454d-9d5b-09dd0a3ed71b\") " pod="openshift-multus/network-metrics-daemon-qhl4l"
Apr 21 16:01:47.586656 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:47.586210 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 16:01:47.586656 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:47.586279 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs podName:4c538f88-ee19-454d-9d5b-09dd0a3ed71b nodeName:}" failed. No retries permitted until 2026-04-21 16:01:49.586258194 +0000 UTC m=+6.193160280 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs") pod "network-metrics-daemon-qhl4l" (UID: "4c538f88-ee19-454d-9d5b-09dd0a3ed71b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 16:01:47.686622 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:47.686583 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs864\" (UniqueName: \"kubernetes.io/projected/92ed2261-b49e-45eb-9081-57d883cfcf5a-kube-api-access-gs864\") pod \"network-check-target-8gq94\" (UID: \"92ed2261-b49e-45eb-9081-57d883cfcf5a\") " pod="openshift-network-diagnostics/network-check-target-8gq94"
Apr 21 16:01:47.686825 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:47.686797 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 16:01:47.686825 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:47.686818 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 16:01:47.686932 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:47.686830 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gs864 for pod openshift-network-diagnostics/network-check-target-8gq94: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 16:01:47.686932 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:47.686894 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92ed2261-b49e-45eb-9081-57d883cfcf5a-kube-api-access-gs864 podName:92ed2261-b49e-45eb-9081-57d883cfcf5a nodeName:}" failed. No retries permitted until 2026-04-21 16:01:49.68687331 +0000 UTC m=+6.293775414 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-gs864" (UniqueName: "kubernetes.io/projected/92ed2261-b49e-45eb-9081-57d883cfcf5a-kube-api-access-gs864") pod "network-check-target-8gq94" (UID: "92ed2261-b49e-45eb-9081-57d883cfcf5a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 16:01:48.052823 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:48.052792 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gq94"
Apr 21 16:01:48.053353 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:48.052919 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gq94" podUID="92ed2261-b49e-45eb-9081-57d883cfcf5a"
Apr 21 16:01:48.053502 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:48.053477 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhl4l"
Apr 21 16:01:48.053650 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:48.053629 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhl4l" podUID="4c538f88-ee19-454d-9d5b-09dd0a3ed71b"
Apr 21 16:01:48.092313 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:48.092281 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal" event={"ID":"0544071a2bc7a957493cde1dbb18d0ee","Type":"ContainerStarted","Data":"b3c1fa4b983668590ff5b8c129a8cb20a73b5b11263a509691e4ab072f49c97d"}
Apr 21 16:01:48.107808 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:48.107670 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-96.ec2.internal" podStartSLOduration=3.107651805 podStartE2EDuration="3.107651805s" podCreationTimestamp="2026-04-21 16:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:01:47.083533431 +0000 UTC m=+3.690435538" watchObservedRunningTime="2026-04-21 16:01:48.107651805 +0000 UTC m=+4.714553913"
Apr 21 16:01:49.104296 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:49.104160 2577 generic.go:358] "Generic (PLEG): container finished" podID="0544071a2bc7a957493cde1dbb18d0ee" containerID="b3c1fa4b983668590ff5b8c129a8cb20a73b5b11263a509691e4ab072f49c97d" exitCode=0
Apr 21 16:01:49.104296 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:49.104258 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal" event={"ID":"0544071a2bc7a957493cde1dbb18d0ee","Type":"ContainerDied","Data":"b3c1fa4b983668590ff5b8c129a8cb20a73b5b11263a509691e4ab072f49c97d"}
Apr 21 16:01:49.165332 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:49.165298 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-fmhv9"]
Apr 21 16:01:49.168334 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:49.168311 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmhv9"
Apr 21 16:01:49.168467 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:49.168392 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmhv9" podUID="27182b2f-30c8-4cef-90b0-b91b4f04047a"
Apr 21 16:01:49.200567 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:49.200536 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/27182b2f-30c8-4cef-90b0-b91b4f04047a-kubelet-config\") pod \"global-pull-secret-syncer-fmhv9\" (UID: \"27182b2f-30c8-4cef-90b0-b91b4f04047a\") " pod="kube-system/global-pull-secret-syncer-fmhv9"
Apr 21 16:01:49.200744 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:49.200594 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/27182b2f-30c8-4cef-90b0-b91b4f04047a-dbus\") pod \"global-pull-secret-syncer-fmhv9\" (UID: \"27182b2f-30c8-4cef-90b0-b91b4f04047a\") " pod="kube-system/global-pull-secret-syncer-fmhv9"
Apr 21 16:01:49.200744 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:49.200626 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/27182b2f-30c8-4cef-90b0-b91b4f04047a-original-pull-secret\") pod \"global-pull-secret-syncer-fmhv9\" (UID: \"27182b2f-30c8-4cef-90b0-b91b4f04047a\") " pod="kube-system/global-pull-secret-syncer-fmhv9"
Apr 21 16:01:49.301360 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:49.301021 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/27182b2f-30c8-4cef-90b0-b91b4f04047a-kubelet-config\") pod \"global-pull-secret-syncer-fmhv9\" (UID: \"27182b2f-30c8-4cef-90b0-b91b4f04047a\") " pod="kube-system/global-pull-secret-syncer-fmhv9"
Apr 21 16:01:49.301360 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:49.301078 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/27182b2f-30c8-4cef-90b0-b91b4f04047a-dbus\") pod \"global-pull-secret-syncer-fmhv9\" (UID: \"27182b2f-30c8-4cef-90b0-b91b4f04047a\") " pod="kube-system/global-pull-secret-syncer-fmhv9"
Apr 21 16:01:49.301360 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:49.301109 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/27182b2f-30c8-4cef-90b0-b91b4f04047a-original-pull-secret\") pod \"global-pull-secret-syncer-fmhv9\" (UID: \"27182b2f-30c8-4cef-90b0-b91b4f04047a\") " pod="kube-system/global-pull-secret-syncer-fmhv9"
Apr 21 16:01:49.301360 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:49.301142 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/27182b2f-30c8-4cef-90b0-b91b4f04047a-kubelet-config\") pod \"global-pull-secret-syncer-fmhv9\" (UID: \"27182b2f-30c8-4cef-90b0-b91b4f04047a\") " pod="kube-system/global-pull-secret-syncer-fmhv9"
Apr 21 16:01:49.301360 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:49.301249 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 16:01:49.301360 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:49.301309 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27182b2f-30c8-4cef-90b0-b91b4f04047a-original-pull-secret podName:27182b2f-30c8-4cef-90b0-b91b4f04047a nodeName:}" failed. No retries permitted until 2026-04-21 16:01:49.801291721 +0000 UTC m=+6.408193813 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/27182b2f-30c8-4cef-90b0-b91b4f04047a-original-pull-secret") pod "global-pull-secret-syncer-fmhv9" (UID: "27182b2f-30c8-4cef-90b0-b91b4f04047a") : object "kube-system"/"original-pull-secret" not registered
Apr 21 16:01:49.301360 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:49.301305 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/27182b2f-30c8-4cef-90b0-b91b4f04047a-dbus\") pod \"global-pull-secret-syncer-fmhv9\" (UID: \"27182b2f-30c8-4cef-90b0-b91b4f04047a\") " pod="kube-system/global-pull-secret-syncer-fmhv9"
Apr 21 16:01:49.604695 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:49.604663 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs\") pod \"network-metrics-daemon-qhl4l\" (UID: \"4c538f88-ee19-454d-9d5b-09dd0a3ed71b\") " pod="openshift-multus/network-metrics-daemon-qhl4l"
Apr 21 16:01:49.604893 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:49.604825 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 16:01:49.604958 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:49.604903 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs podName:4c538f88-ee19-454d-9d5b-09dd0a3ed71b nodeName:}" failed. No retries permitted until 2026-04-21 16:01:53.604882687 +0000 UTC m=+10.211784778 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs") pod "network-metrics-daemon-qhl4l" (UID: "4c538f88-ee19-454d-9d5b-09dd0a3ed71b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 16:01:49.705971 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:49.705933 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs864\" (UniqueName: \"kubernetes.io/projected/92ed2261-b49e-45eb-9081-57d883cfcf5a-kube-api-access-gs864\") pod \"network-check-target-8gq94\" (UID: \"92ed2261-b49e-45eb-9081-57d883cfcf5a\") " pod="openshift-network-diagnostics/network-check-target-8gq94"
Apr 21 16:01:49.706150 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:49.706115 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 16:01:49.706150 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:49.706133 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 16:01:49.706150 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:49.706146 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gs864 for pod openshift-network-diagnostics/network-check-target-8gq94: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 16:01:49.706273 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:49.706209 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92ed2261-b49e-45eb-9081-57d883cfcf5a-kube-api-access-gs864 podName:92ed2261-b49e-45eb-9081-57d883cfcf5a nodeName:}" failed. No retries permitted until 2026-04-21 16:01:53.706190528 +0000 UTC m=+10.313092629 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-gs864" (UniqueName: "kubernetes.io/projected/92ed2261-b49e-45eb-9081-57d883cfcf5a-kube-api-access-gs864") pod "network-check-target-8gq94" (UID: "92ed2261-b49e-45eb-9081-57d883cfcf5a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 16:01:49.807448 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:49.806853 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/27182b2f-30c8-4cef-90b0-b91b4f04047a-original-pull-secret\") pod \"global-pull-secret-syncer-fmhv9\" (UID: \"27182b2f-30c8-4cef-90b0-b91b4f04047a\") " pod="kube-system/global-pull-secret-syncer-fmhv9"
Apr 21 16:01:49.807448 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:49.807004 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 16:01:49.807448 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:49.807070 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27182b2f-30c8-4cef-90b0-b91b4f04047a-original-pull-secret podName:27182b2f-30c8-4cef-90b0-b91b4f04047a nodeName:}" failed. No retries permitted until 2026-04-21 16:01:50.807051451 +0000 UTC m=+7.413953607 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/27182b2f-30c8-4cef-90b0-b91b4f04047a-original-pull-secret") pod "global-pull-secret-syncer-fmhv9" (UID: "27182b2f-30c8-4cef-90b0-b91b4f04047a") : object "kube-system"/"original-pull-secret" not registered
Apr 21 16:01:50.051091 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:50.050918 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhl4l"
Apr 21 16:01:50.051091 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:50.051060 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhl4l" podUID="4c538f88-ee19-454d-9d5b-09dd0a3ed71b"
Apr 21 16:01:50.051526 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:50.051387 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gq94"
Apr 21 16:01:50.051526 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:50.051492 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gq94" podUID="92ed2261-b49e-45eb-9081-57d883cfcf5a"
Apr 21 16:01:50.815201 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:50.815075 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/27182b2f-30c8-4cef-90b0-b91b4f04047a-original-pull-secret\") pod \"global-pull-secret-syncer-fmhv9\" (UID: \"27182b2f-30c8-4cef-90b0-b91b4f04047a\") " pod="kube-system/global-pull-secret-syncer-fmhv9"
Apr 21 16:01:50.815637 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:50.815220 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 16:01:50.815637 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:50.815296 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27182b2f-30c8-4cef-90b0-b91b4f04047a-original-pull-secret podName:27182b2f-30c8-4cef-90b0-b91b4f04047a nodeName:}" failed. No retries permitted until 2026-04-21 16:01:52.815274316 +0000 UTC m=+9.422176416 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/27182b2f-30c8-4cef-90b0-b91b4f04047a-original-pull-secret") pod "global-pull-secret-syncer-fmhv9" (UID: "27182b2f-30c8-4cef-90b0-b91b4f04047a") : object "kube-system"/"original-pull-secret" not registered
Apr 21 16:01:51.050304 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:51.050267 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmhv9"
Apr 21 16:01:51.050466 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:51.050410 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmhv9" podUID="27182b2f-30c8-4cef-90b0-b91b4f04047a"
Apr 21 16:01:52.050676 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:52.050590 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gq94"
Apr 21 16:01:52.050676 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:52.050628 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhl4l"
Apr 21 16:01:52.051310 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:52.050727 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gq94" podUID="92ed2261-b49e-45eb-9081-57d883cfcf5a"
Apr 21 16:01:52.051310 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:52.050841 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhl4l" podUID="4c538f88-ee19-454d-9d5b-09dd0a3ed71b"
Apr 21 16:01:52.832491 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:52.832405 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/27182b2f-30c8-4cef-90b0-b91b4f04047a-original-pull-secret\") pod \"global-pull-secret-syncer-fmhv9\" (UID: \"27182b2f-30c8-4cef-90b0-b91b4f04047a\") " pod="kube-system/global-pull-secret-syncer-fmhv9"
Apr 21 16:01:52.832682 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:52.832561 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 16:01:52.832682 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:52.832642 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27182b2f-30c8-4cef-90b0-b91b4f04047a-original-pull-secret podName:27182b2f-30c8-4cef-90b0-b91b4f04047a nodeName:}" failed. No retries permitted until 2026-04-21 16:01:56.832621807 +0000 UTC m=+13.439523906 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/27182b2f-30c8-4cef-90b0-b91b4f04047a-original-pull-secret") pod "global-pull-secret-syncer-fmhv9" (UID: "27182b2f-30c8-4cef-90b0-b91b4f04047a") : object "kube-system"/"original-pull-secret" not registered
Apr 21 16:01:53.050383 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:53.050032 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmhv9"
Apr 21 16:01:53.050383 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:53.050181 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmhv9" podUID="27182b2f-30c8-4cef-90b0-b91b4f04047a"
Apr 21 16:01:53.638693 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:53.638564 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs\") pod \"network-metrics-daemon-qhl4l\" (UID: \"4c538f88-ee19-454d-9d5b-09dd0a3ed71b\") " pod="openshift-multus/network-metrics-daemon-qhl4l"
Apr 21 16:01:53.639133 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:53.638739 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 16:01:53.639133 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:53.638821 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs podName:4c538f88-ee19-454d-9d5b-09dd0a3ed71b nodeName:}" failed. No retries permitted until 2026-04-21 16:02:01.638800595 +0000 UTC m=+18.245702683 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs") pod "network-metrics-daemon-qhl4l" (UID: "4c538f88-ee19-454d-9d5b-09dd0a3ed71b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 16:01:53.740099 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:53.739959 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs864\" (UniqueName: \"kubernetes.io/projected/92ed2261-b49e-45eb-9081-57d883cfcf5a-kube-api-access-gs864\") pod \"network-check-target-8gq94\" (UID: \"92ed2261-b49e-45eb-9081-57d883cfcf5a\") " pod="openshift-network-diagnostics/network-check-target-8gq94"
Apr 21 16:01:53.740273 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:53.740143 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 16:01:53.740273 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:53.740161 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 16:01:53.740273 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:53.740172 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gs864 for pod openshift-network-diagnostics/network-check-target-8gq94: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 16:01:53.740273 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:53.740233 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92ed2261-b49e-45eb-9081-57d883cfcf5a-kube-api-access-gs864 podName:92ed2261-b49e-45eb-9081-57d883cfcf5a nodeName:}" failed. No retries permitted until 2026-04-21 16:02:01.740213558 +0000 UTC m=+18.347115651 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-gs864" (UniqueName: "kubernetes.io/projected/92ed2261-b49e-45eb-9081-57d883cfcf5a-kube-api-access-gs864") pod "network-check-target-8gq94" (UID: "92ed2261-b49e-45eb-9081-57d883cfcf5a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 16:01:54.051966 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:54.051646 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gq94"
Apr 21 16:01:54.051966 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:54.051762 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gq94" podUID="92ed2261-b49e-45eb-9081-57d883cfcf5a"
Apr 21 16:01:54.052199 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:54.052002 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhl4l"
Apr 21 16:01:54.052199 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:54.052145 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhl4l" podUID="4c538f88-ee19-454d-9d5b-09dd0a3ed71b"
Apr 21 16:01:55.049920 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:55.049875 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmhv9"
Apr 21 16:01:55.050402 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:55.050007 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmhv9" podUID="27182b2f-30c8-4cef-90b0-b91b4f04047a"
Apr 21 16:01:56.050377 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:56.050157 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gq94"
Apr 21 16:01:56.050377 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:56.050175 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhl4l"
Apr 21 16:01:56.050377 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:56.050281 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gq94" podUID="92ed2261-b49e-45eb-9081-57d883cfcf5a"
Apr 21 16:01:56.050915 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:56.050415 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhl4l" podUID="4c538f88-ee19-454d-9d5b-09dd0a3ed71b"
Apr 21 16:01:56.865675 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:56.865636 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/27182b2f-30c8-4cef-90b0-b91b4f04047a-original-pull-secret\") pod \"global-pull-secret-syncer-fmhv9\" (UID: \"27182b2f-30c8-4cef-90b0-b91b4f04047a\") " pod="kube-system/global-pull-secret-syncer-fmhv9"
Apr 21 16:01:56.865890 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:56.865802 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 16:01:56.865890 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:56.865862 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27182b2f-30c8-4cef-90b0-b91b4f04047a-original-pull-secret podName:27182b2f-30c8-4cef-90b0-b91b4f04047a nodeName:}" failed. No retries permitted until 2026-04-21 16:02:04.865844637 +0000 UTC m=+21.472746748 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/27182b2f-30c8-4cef-90b0-b91b4f04047a-original-pull-secret") pod "global-pull-secret-syncer-fmhv9" (UID: "27182b2f-30c8-4cef-90b0-b91b4f04047a") : object "kube-system"/"original-pull-secret" not registered
Apr 21 16:01:57.049660 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:57.049631 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmhv9"
Apr 21 16:01:57.049843 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:57.049729 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmhv9" podUID="27182b2f-30c8-4cef-90b0-b91b4f04047a"
Apr 21 16:01:58.052368 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:58.052335 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhl4l"
Apr 21 16:01:58.052834 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:58.052335 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gq94"
Apr 21 16:01:58.052834 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:58.052443 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-qhl4l" podUID="4c538f88-ee19-454d-9d5b-09dd0a3ed71b" Apr 21 16:01:58.052834 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:58.052510 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gq94" podUID="92ed2261-b49e-45eb-9081-57d883cfcf5a" Apr 21 16:01:59.049989 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:01:59.049963 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmhv9" Apr 21 16:01:59.050098 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:01:59.050065 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmhv9" podUID="27182b2f-30c8-4cef-90b0-b91b4f04047a" Apr 21 16:02:00.050034 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:00.049994 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gq94" Apr 21 16:02:00.050493 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:00.050123 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8gq94" podUID="92ed2261-b49e-45eb-9081-57d883cfcf5a" Apr 21 16:02:00.050493 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:00.050377 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhl4l" Apr 21 16:02:00.050603 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:00.050531 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhl4l" podUID="4c538f88-ee19-454d-9d5b-09dd0a3ed71b" Apr 21 16:02:01.050233 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:01.050200 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmhv9" Apr 21 16:02:01.050588 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:01.050303 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fmhv9" podUID="27182b2f-30c8-4cef-90b0-b91b4f04047a" Apr 21 16:02:01.701441 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:01.701404 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs\") pod \"network-metrics-daemon-qhl4l\" (UID: \"4c538f88-ee19-454d-9d5b-09dd0a3ed71b\") " pod="openshift-multus/network-metrics-daemon-qhl4l" Apr 21 16:02:01.701610 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:01.701564 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:02:01.701680 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:01.701632 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs podName:4c538f88-ee19-454d-9d5b-09dd0a3ed71b nodeName:}" failed. No retries permitted until 2026-04-21 16:02:17.701616131 +0000 UTC m=+34.308518221 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs") pod "network-metrics-daemon-qhl4l" (UID: "4c538f88-ee19-454d-9d5b-09dd0a3ed71b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 16:02:01.802307 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:01.802270 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs864\" (UniqueName: \"kubernetes.io/projected/92ed2261-b49e-45eb-9081-57d883cfcf5a-kube-api-access-gs864\") pod \"network-check-target-8gq94\" (UID: \"92ed2261-b49e-45eb-9081-57d883cfcf5a\") " pod="openshift-network-diagnostics/network-check-target-8gq94" Apr 21 16:02:01.802475 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:01.802423 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 16:02:01.802475 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:01.802441 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 16:02:01.802475 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:01.802452 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gs864 for pod openshift-network-diagnostics/network-check-target-8gq94: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:02:01.802608 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:01.802516 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92ed2261-b49e-45eb-9081-57d883cfcf5a-kube-api-access-gs864 podName:92ed2261-b49e-45eb-9081-57d883cfcf5a nodeName:}" failed. 
No retries permitted until 2026-04-21 16:02:17.802495661 +0000 UTC m=+34.409397766 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-gs864" (UniqueName: "kubernetes.io/projected/92ed2261-b49e-45eb-9081-57d883cfcf5a-kube-api-access-gs864") pod "network-check-target-8gq94" (UID: "92ed2261-b49e-45eb-9081-57d883cfcf5a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 16:02:02.052737 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:02.052651 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhl4l" Apr 21 16:02:02.052737 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:02.052697 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gq94" Apr 21 16:02:02.053170 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:02.052787 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhl4l" podUID="4c538f88-ee19-454d-9d5b-09dd0a3ed71b" Apr 21 16:02:02.053170 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:02.052869 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8gq94" podUID="92ed2261-b49e-45eb-9081-57d883cfcf5a" Apr 21 16:02:03.049937 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:03.049906 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmhv9" Apr 21 16:02:03.050123 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:03.050024 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmhv9" podUID="27182b2f-30c8-4cef-90b0-b91b4f04047a" Apr 21 16:02:04.051397 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:04.051208 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gq94" Apr 21 16:02:04.051994 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:04.051276 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhl4l" Apr 21 16:02:04.051994 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:04.051470 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8gq94" podUID="92ed2261-b49e-45eb-9081-57d883cfcf5a" Apr 21 16:02:04.051994 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:04.051559 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhl4l" podUID="4c538f88-ee19-454d-9d5b-09dd0a3ed71b" Apr 21 16:02:04.128791 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:04.128750 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" event={"ID":"1b6959b7-f233-41c1-ba9b-fd52091ee1be","Type":"ContainerStarted","Data":"ee6139c107ba0e1f0c91e44089c32776995deff438a85eda5c4673b76ebdb14c"} Apr 21 16:02:04.130308 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:04.130288 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" event={"ID":"06e52eb9-a2bb-4db2-8e4f-f435db21156c","Type":"ContainerStarted","Data":"37a397b1585eb0b30a2d0dfdb151c47d5d350ae4ea77382a2e0b95d1f23c3b9e"} Apr 21 16:02:04.130395 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:04.130315 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" event={"ID":"06e52eb9-a2bb-4db2-8e4f-f435db21156c","Type":"ContainerStarted","Data":"0d22549cd0deafb853c9a909371d0b2ab99fa3f751c2e3cf1c941a6320ccc2cb"} Apr 21 16:02:04.131601 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:04.131576 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8flks" event={"ID":"d74dfe95-7664-4441-8273-cc22ee22d89f","Type":"ContainerStarted","Data":"e34a692fd1e43603a459036f4ff2c35dc5d81394712b70171071484c7d83f7ea"} Apr 21 16:02:04.136203 ip-10-0-129-96 kubenswrapper[2577]: 
I0421 16:02:04.136148 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal" event={"ID":"0544071a2bc7a957493cde1dbb18d0ee","Type":"ContainerStarted","Data":"74c4cd5badefb58bd2ce31c276bc1907dec7387e6e227577307da2442f93c126"} Apr 21 16:02:04.138204 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:04.138183 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8jszj" event={"ID":"7640279c-4346-4f46-b222-3c04e5d7569e","Type":"ContainerStarted","Data":"b56115814c03ff35bff347305ebac832db75bba9c4f8d3e92a8810a643bbd675"} Apr 21 16:02:04.139460 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:04.139440 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4r2h9" event={"ID":"59ea63ef-850a-4d2b-867e-080e5b551a72","Type":"ContainerStarted","Data":"6e5ae9fc67463ddda8b3f6062906865c547b64c2d9913b7fcaab742db5f43dd0"} Apr 21 16:02:04.140614 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:04.140596 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt" event={"ID":"881adee0-d6c5-4372-babe-35f8b57591e6","Type":"ContainerStarted","Data":"35ec6e3cedb45d5c5d457079f780d5c1c8c6b32b8d1efaf11f514beb1f416d88"} Apr 21 16:02:04.141693 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:04.141670 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-86kzc" event={"ID":"4701d99b-4f05-4410-bf15-f5fd3e3bf5bf","Type":"ContainerStarted","Data":"369a9aa5d928d391c1c1a3aec4340ad569e6055b9f0ee02ed9b17d90f768e15f"} Apr 21 16:02:04.142747 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:04.142729 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-729fq" event={"ID":"f334a381-fb5b-413d-b975-0e2511378c95","Type":"ContainerStarted","Data":"af711f32a585808d49c7e31def308ce0b1b43ab7d152164829e78e88ed88bae3"} 
Apr 21 16:02:04.163809 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:04.163631 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8flks" podStartSLOduration=8.024320217 podStartE2EDuration="20.163613823s" podCreationTimestamp="2026-04-21 16:01:44 +0000 UTC" firstStartedPulling="2026-04-21 16:01:46.885954366 +0000 UTC m=+3.492856453" lastFinishedPulling="2026-04-21 16:01:59.025247971 +0000 UTC m=+15.632150059" observedRunningTime="2026-04-21 16:02:04.163278821 +0000 UTC m=+20.770180928" watchObservedRunningTime="2026-04-21 16:02:04.163613823 +0000 UTC m=+20.770515931" Apr 21 16:02:04.164136 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:04.164063 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-r7qx7" podStartSLOduration=3.354161787 podStartE2EDuration="20.164052811s" podCreationTimestamp="2026-04-21 16:01:44 +0000 UTC" firstStartedPulling="2026-04-21 16:01:46.875284377 +0000 UTC m=+3.482186485" lastFinishedPulling="2026-04-21 16:02:03.68517541 +0000 UTC m=+20.292077509" observedRunningTime="2026-04-21 16:02:04.14936037 +0000 UTC m=+20.756262476" watchObservedRunningTime="2026-04-21 16:02:04.164052811 +0000 UTC m=+20.770954961" Apr 21 16:02:04.218578 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:04.218490 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-86kzc" podStartSLOduration=8.072804454 podStartE2EDuration="20.218471255s" podCreationTimestamp="2026-04-21 16:01:44 +0000 UTC" firstStartedPulling="2026-04-21 16:01:46.87975927 +0000 UTC m=+3.486661368" lastFinishedPulling="2026-04-21 16:01:59.02542607 +0000 UTC m=+15.632328169" observedRunningTime="2026-04-21 16:02:04.217751776 +0000 UTC m=+20.824653887" watchObservedRunningTime="2026-04-21 16:02:04.218471255 +0000 UTC m=+20.825373364" Apr 21 16:02:04.240821 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:04.240758 2577 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-96.ec2.internal" podStartSLOduration=19.240745498 podStartE2EDuration="19.240745498s" podCreationTimestamp="2026-04-21 16:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:02:04.240423841 +0000 UTC m=+20.847325948" watchObservedRunningTime="2026-04-21 16:02:04.240745498 +0000 UTC m=+20.847647606" Apr 21 16:02:04.278690 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:04.278547 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4r2h9" podStartSLOduration=3.438377451 podStartE2EDuration="20.278532363s" podCreationTimestamp="2026-04-21 16:01:44 +0000 UTC" firstStartedPulling="2026-04-21 16:01:46.884861039 +0000 UTC m=+3.491763139" lastFinishedPulling="2026-04-21 16:02:03.725015963 +0000 UTC m=+20.331918051" observedRunningTime="2026-04-21 16:02:04.262035886 +0000 UTC m=+20.868937992" watchObservedRunningTime="2026-04-21 16:02:04.278532363 +0000 UTC m=+20.885434520" Apr 21 16:02:04.278897 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:04.278874 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-729fq" podStartSLOduration=3.808398898 podStartE2EDuration="20.278867111s" podCreationTimestamp="2026-04-21 16:01:44 +0000 UTC" firstStartedPulling="2026-04-21 16:01:46.879068017 +0000 UTC m=+3.485970116" lastFinishedPulling="2026-04-21 16:02:03.349536242 +0000 UTC m=+19.956438329" observedRunningTime="2026-04-21 16:02:04.278414772 +0000 UTC m=+20.885316878" watchObservedRunningTime="2026-04-21 16:02:04.278867111 +0000 UTC m=+20.885769225" Apr 21 16:02:04.630728 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:04.630690 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kube-system/konnectivity-agent-729fq" Apr 21 16:02:04.631846 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:04.631820 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-729fq" Apr 21 16:02:04.923927 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:04.923887 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/27182b2f-30c8-4cef-90b0-b91b4f04047a-original-pull-secret\") pod \"global-pull-secret-syncer-fmhv9\" (UID: \"27182b2f-30c8-4cef-90b0-b91b4f04047a\") " pod="kube-system/global-pull-secret-syncer-fmhv9" Apr 21 16:02:04.924085 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:04.924036 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 16:02:04.924124 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:04.924111 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27182b2f-30c8-4cef-90b0-b91b4f04047a-original-pull-secret podName:27182b2f-30c8-4cef-90b0-b91b4f04047a nodeName:}" failed. No retries permitted until 2026-04-21 16:02:20.92408441 +0000 UTC m=+37.530986496 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/27182b2f-30c8-4cef-90b0-b91b4f04047a-original-pull-secret") pod "global-pull-secret-syncer-fmhv9" (UID: "27182b2f-30c8-4cef-90b0-b91b4f04047a") : object "kube-system"/"original-pull-secret" not registered Apr 21 16:02:05.049832 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:05.049746 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmhv9" Apr 21 16:02:05.049964 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:05.049873 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmhv9" podUID="27182b2f-30c8-4cef-90b0-b91b4f04047a" Apr 21 16:02:05.145371 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:05.145338 2577 generic.go:358] "Generic (PLEG): container finished" podID="7640279c-4346-4f46-b222-3c04e5d7569e" containerID="b56115814c03ff35bff347305ebac832db75bba9c4f8d3e92a8810a643bbd675" exitCode=0 Apr 21 16:02:05.145929 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:05.145381 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8jszj" event={"ID":"7640279c-4346-4f46-b222-3c04e5d7569e","Type":"ContainerDied","Data":"b56115814c03ff35bff347305ebac832db75bba9c4f8d3e92a8810a643bbd675"} Apr 21 16:02:05.149970 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:05.149952 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rkhl_06e52eb9-a2bb-4db2-8e4f-f435db21156c/ovn-acl-logging/0.log" Apr 21 16:02:05.150264 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:05.150244 2577 generic.go:358] "Generic (PLEG): container finished" podID="06e52eb9-a2bb-4db2-8e4f-f435db21156c" containerID="37a397b1585eb0b30a2d0dfdb151c47d5d350ae4ea77382a2e0b95d1f23c3b9e" exitCode=1 Apr 21 16:02:05.150346 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:05.150281 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" 
event={"ID":"06e52eb9-a2bb-4db2-8e4f-f435db21156c","Type":"ContainerDied","Data":"37a397b1585eb0b30a2d0dfdb151c47d5d350ae4ea77382a2e0b95d1f23c3b9e"} Apr 21 16:02:05.150346 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:05.150316 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" event={"ID":"06e52eb9-a2bb-4db2-8e4f-f435db21156c","Type":"ContainerStarted","Data":"e5fd44a9e61f189f06c2c829bbc16705e39c12276afdb52b7c27bd70bd7ad6c1"} Apr 21 16:02:05.150346 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:05.150332 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" event={"ID":"06e52eb9-a2bb-4db2-8e4f-f435db21156c","Type":"ContainerStarted","Data":"85448aedf81bfadac10e9c7fb0a1bcb9b1dc63093e50544159cf5c53dd48db14"} Apr 21 16:02:05.150346 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:05.150344 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" event={"ID":"06e52eb9-a2bb-4db2-8e4f-f435db21156c","Type":"ContainerStarted","Data":"cd89943bb49da07933059b098de7afe513385335fd9c679713fa21cbe4488fb7"} Apr 21 16:02:05.150505 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:05.150357 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" event={"ID":"06e52eb9-a2bb-4db2-8e4f-f435db21156c","Type":"ContainerStarted","Data":"cd0efa690c9259c0d24e50e624764775727f13e35a68c4a5b4b7e0fbec717d2d"} Apr 21 16:02:05.150600 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:05.150582 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-729fq" Apr 21 16:02:05.151289 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:05.151271 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-729fq" Apr 21 16:02:05.438929 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:05.438898 2577 
plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 16:02:06.031890 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:06.031763 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T16:02:05.438918774Z","UUID":"45c90f64-5d20-46b9-bb1c-c27ec84368eb","Handler":null,"Name":"","Endpoint":""} Apr 21 16:02:06.033800 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:06.033766 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 16:02:06.033922 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:06.033811 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 16:02:06.052981 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:06.052951 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gq94" Apr 21 16:02:06.052981 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:06.052975 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhl4l" Apr 21 16:02:06.053162 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:06.053086 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8gq94" podUID="92ed2261-b49e-45eb-9081-57d883cfcf5a" Apr 21 16:02:06.053216 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:06.053175 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhl4l" podUID="4c538f88-ee19-454d-9d5b-09dd0a3ed71b" Apr 21 16:02:06.155313 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:06.155276 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt" event={"ID":"881adee0-d6c5-4372-babe-35f8b57591e6","Type":"ContainerStarted","Data":"b67265d718fef8b680425fc85441242a2e03fa26428402b960b16c8fa5f6d977"} Apr 21 16:02:06.156693 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:06.156650 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-cpgvc" event={"ID":"9501ad15-36cb-45df-9373-b96d6a0e43cf","Type":"ContainerStarted","Data":"3d33b6a76737dd44f08b417134760f84acefab8a388c6df18fdb24b3cbfd1d1b"} Apr 21 16:02:06.194564 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:06.194501 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-cpgvc" podStartSLOduration=5.387904634 podStartE2EDuration="22.194469922s" podCreationTimestamp="2026-04-21 16:01:44 +0000 UTC" firstStartedPulling="2026-04-21 16:01:46.878269128 +0000 UTC m=+3.485171224" lastFinishedPulling="2026-04-21 16:02:03.684834414 +0000 UTC m=+20.291736512" observedRunningTime="2026-04-21 16:02:06.194178862 +0000 UTC m=+22.801080970" watchObservedRunningTime="2026-04-21 16:02:06.194469922 +0000 UTC m=+22.801372032" Apr 21 16:02:07.050424 ip-10-0-129-96 kubenswrapper[2577]: I0421 
16:02:07.050182 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmhv9" Apr 21 16:02:07.050624 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:07.050529 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmhv9" podUID="27182b2f-30c8-4cef-90b0-b91b4f04047a" Apr 21 16:02:07.161401 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:07.161356 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt" event={"ID":"881adee0-d6c5-4372-babe-35f8b57591e6","Type":"ContainerStarted","Data":"109b28f7f5126522f31255577835ed0039283230370035f94007a90624b6daac"} Apr 21 16:02:07.164723 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:07.164702 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rkhl_06e52eb9-a2bb-4db2-8e4f-f435db21156c/ovn-acl-logging/0.log" Apr 21 16:02:07.165079 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:07.165050 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" event={"ID":"06e52eb9-a2bb-4db2-8e4f-f435db21156c","Type":"ContainerStarted","Data":"56c50551020b847d3f418a79b4c9f08a5736b612f2f0a0d907cd02056aca6f4e"} Apr 21 16:02:07.181547 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:07.181507 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zczqt" podStartSLOduration=3.55660878 podStartE2EDuration="23.181495584s" podCreationTimestamp="2026-04-21 16:01:44 +0000 UTC" firstStartedPulling="2026-04-21 16:01:46.884272567 +0000 UTC m=+3.491174653" 
lastFinishedPulling="2026-04-21 16:02:06.509159372 +0000 UTC m=+23.116061457" observedRunningTime="2026-04-21 16:02:07.18144821 +0000 UTC m=+23.788350317" watchObservedRunningTime="2026-04-21 16:02:07.181495584 +0000 UTC m=+23.788397691" Apr 21 16:02:08.050081 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:08.050039 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhl4l" Apr 21 16:02:08.050298 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:08.050235 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhl4l" podUID="4c538f88-ee19-454d-9d5b-09dd0a3ed71b" Apr 21 16:02:08.050298 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:08.050279 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gq94" Apr 21 16:02:08.050418 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:08.050401 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gq94" podUID="92ed2261-b49e-45eb-9081-57d883cfcf5a" Apr 21 16:02:09.049811 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:09.049726 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmhv9" Apr 21 16:02:09.050436 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:09.049845 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmhv9" podUID="27182b2f-30c8-4cef-90b0-b91b4f04047a" Apr 21 16:02:10.050483 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:10.050446 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhl4l" Apr 21 16:02:10.051106 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:10.050446 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gq94" Apr 21 16:02:10.051106 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:10.050592 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhl4l" podUID="4c538f88-ee19-454d-9d5b-09dd0a3ed71b" Apr 21 16:02:10.051106 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:10.050640 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8gq94" podUID="92ed2261-b49e-45eb-9081-57d883cfcf5a" Apr 21 16:02:10.170767 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:10.170728 2577 generic.go:358] "Generic (PLEG): container finished" podID="7640279c-4346-4f46-b222-3c04e5d7569e" containerID="8a72fa86824f75e52c6b8b899bfde263d2b34403ab662028a0c58cb2463e6608" exitCode=0 Apr 21 16:02:10.170949 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:10.170809 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8jszj" event={"ID":"7640279c-4346-4f46-b222-3c04e5d7569e","Type":"ContainerDied","Data":"8a72fa86824f75e52c6b8b899bfde263d2b34403ab662028a0c58cb2463e6608"} Apr 21 16:02:10.173940 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:10.173923 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rkhl_06e52eb9-a2bb-4db2-8e4f-f435db21156c/ovn-acl-logging/0.log" Apr 21 16:02:10.174256 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:10.174229 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" event={"ID":"06e52eb9-a2bb-4db2-8e4f-f435db21156c","Type":"ContainerStarted","Data":"9f545e26358cda65f3d75d0a9140a1aab8877760226917e7326730c949d68ebd"} Apr 21 16:02:10.174523 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:10.174508 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:02:10.174588 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:10.174535 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:02:10.174667 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:10.174655 2577 scope.go:117] "RemoveContainer" containerID="37a397b1585eb0b30a2d0dfdb151c47d5d350ae4ea77382a2e0b95d1f23c3b9e" Apr 21 16:02:10.190231 ip-10-0-129-96 kubenswrapper[2577]: I0421 
16:02:10.190207 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:02:11.050492 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:11.050452 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmhv9" Apr 21 16:02:11.050923 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:11.050567 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmhv9" podUID="27182b2f-30c8-4cef-90b0-b91b4f04047a" Apr 21 16:02:11.178995 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:11.178912 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rkhl_06e52eb9-a2bb-4db2-8e4f-f435db21156c/ovn-acl-logging/0.log" Apr 21 16:02:11.179333 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:11.179307 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" event={"ID":"06e52eb9-a2bb-4db2-8e4f-f435db21156c","Type":"ContainerStarted","Data":"de1f218dc7d1bafbe5362762394e3a3ade0563546cf7508ef56315ba0a47164f"} Apr 21 16:02:11.179510 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:11.179490 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:02:11.181228 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:11.181205 2577 generic.go:358] "Generic (PLEG): container finished" podID="7640279c-4346-4f46-b222-3c04e5d7569e" containerID="c7b9d527d5b8441bf978aea348884e5dbef4250df08ecabcfbc01fd4f7a98110" exitCode=0 Apr 21 16:02:11.181306 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:11.181235 2577 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8jszj" event={"ID":"7640279c-4346-4f46-b222-3c04e5d7569e","Type":"ContainerDied","Data":"c7b9d527d5b8441bf978aea348884e5dbef4250df08ecabcfbc01fd4f7a98110"} Apr 21 16:02:11.194117 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:11.194094 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" Apr 21 16:02:11.216668 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:11.216627 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl" podStartSLOduration=10.311225796 podStartE2EDuration="27.216612875s" podCreationTimestamp="2026-04-21 16:01:44 +0000 UTC" firstStartedPulling="2026-04-21 16:01:46.872620766 +0000 UTC m=+3.479522853" lastFinishedPulling="2026-04-21 16:02:03.778007846 +0000 UTC m=+20.384909932" observedRunningTime="2026-04-21 16:02:11.215083572 +0000 UTC m=+27.821985678" watchObservedRunningTime="2026-04-21 16:02:11.216612875 +0000 UTC m=+27.823514981" Apr 21 16:02:11.247280 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:11.247253 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fmhv9"] Apr 21 16:02:11.247438 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:11.247336 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmhv9" Apr 21 16:02:11.247438 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:11.247416 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fmhv9" podUID="27182b2f-30c8-4cef-90b0-b91b4f04047a" Apr 21 16:02:11.249211 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:11.249185 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-8gq94"] Apr 21 16:02:11.249339 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:11.249295 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gq94" Apr 21 16:02:11.249413 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:11.249393 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gq94" podUID="92ed2261-b49e-45eb-9081-57d883cfcf5a" Apr 21 16:02:11.250110 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:11.250091 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qhl4l"] Apr 21 16:02:11.250207 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:11.250194 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhl4l" Apr 21 16:02:11.250281 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:11.250269 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qhl4l" podUID="4c538f88-ee19-454d-9d5b-09dd0a3ed71b" Apr 21 16:02:12.185436 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:12.185404 2577 generic.go:358] "Generic (PLEG): container finished" podID="7640279c-4346-4f46-b222-3c04e5d7569e" containerID="b92fdd58caaa92b0452db2247a6de85ef037a9691084138e17c74777655c5366" exitCode=0 Apr 21 16:02:12.185870 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:12.185488 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8jszj" event={"ID":"7640279c-4346-4f46-b222-3c04e5d7569e","Type":"ContainerDied","Data":"b92fdd58caaa92b0452db2247a6de85ef037a9691084138e17c74777655c5366"} Apr 21 16:02:13.050455 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:13.050422 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmhv9" Apr 21 16:02:13.050606 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:13.050461 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gq94" Apr 21 16:02:13.050606 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:13.050488 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhl4l" Apr 21 16:02:13.050606 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:13.050580 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8gq94" podUID="92ed2261-b49e-45eb-9081-57d883cfcf5a" Apr 21 16:02:13.050765 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:13.050669 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmhv9" podUID="27182b2f-30c8-4cef-90b0-b91b4f04047a" Apr 21 16:02:13.050765 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:13.050709 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhl4l" podUID="4c538f88-ee19-454d-9d5b-09dd0a3ed71b" Apr 21 16:02:15.049535 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.049499 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhl4l" Apr 21 16:02:15.050060 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.049499 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gq94" Apr 21 16:02:15.050060 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.049499 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmhv9" Apr 21 16:02:15.050060 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:15.049738 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8gq94" podUID="92ed2261-b49e-45eb-9081-57d883cfcf5a" Apr 21 16:02:15.050060 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:15.049631 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhl4l" podUID="4c538f88-ee19-454d-9d5b-09dd0a3ed71b" Apr 21 16:02:15.050060 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:15.049817 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fmhv9" podUID="27182b2f-30c8-4cef-90b0-b91b4f04047a" Apr 21 16:02:15.638025 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.637994 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-96.ec2.internal" event="NodeReady" Apr 21 16:02:15.638188 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.638134 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 16:02:15.681831 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.681795 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs"] Apr 21 16:02:15.691394 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.691364 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4"] Apr 21 16:02:15.691564 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.691543 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs" Apr 21 16:02:15.694486 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.694403 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 21 16:02:15.694486 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.694449 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 21 16:02:15.694486 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.694470 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 21 16:02:15.694486 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.694480 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 21 16:02:15.694750 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.694480 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 21 16:02:15.694750 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.694409 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 21 16:02:15.694750 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.694573 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 21 16:02:15.698434 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.698410 2577 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b946f688f-rqvbb"] Apr 21 16:02:15.698846 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.698670 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4" Apr 21 16:02:15.704252 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.701625 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 21 16:02:15.704252 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.703016 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-685b9dc9fd-l8klc"] Apr 21 16:02:15.704252 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.703469 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b946f688f-rqvbb" Apr 21 16:02:15.706666 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.706485 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 21 16:02:15.706912 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.706891 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-b29bn\"" Apr 21 16:02:15.708388 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.708368 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs"] Apr 21 16:02:15.708484 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.708393 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4"] Apr 21 16:02:15.708484 ip-10-0-129-96 
kubenswrapper[2577]: I0421 16:02:15.708404 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-685b9dc9fd-l8klc"] Apr 21 16:02:15.708484 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.708416 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b946f688f-rqvbb"] Apr 21 16:02:15.708647 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.708514 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc" Apr 21 16:02:15.709880 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.709860 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bqkbv"] Apr 21 16:02:15.711116 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.711089 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 21 16:02:15.711452 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.711316 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 21 16:02:15.711452 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.711324 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 21 16:02:15.711452 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.711400 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-vqm4s\"" Apr 21 16:02:15.716715 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.716346 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bqkbv" Apr 21 16:02:15.719806 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.719767 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 21 16:02:15.720037 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.720007 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 16:02:15.722543 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.722522 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bqkbv"] Apr 21 16:02:15.724046 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.724026 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 16:02:15.725344 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.724819 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 16:02:15.725532 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.725513 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-76wnm\"" Apr 21 16:02:15.802676 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.802647 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mp77f"] Apr 21 16:02:15.809371 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.809341 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc" Apr 21 16:02:15.809371 ip-10-0-129-96 
kubenswrapper[2577]: I0421 16:02:15.809374 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-bound-sa-token\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc" Apr 21 16:02:15.809558 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.809394 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/02e5400d-5c59-42f1-b5a9-08b5a2f449fe-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6b946f688f-rqvbb\" (UID: \"02e5400d-5c59-42f1-b5a9-08b5a2f449fe\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b946f688f-rqvbb" Apr 21 16:02:15.809558 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.809414 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppp89\" (UniqueName: \"kubernetes.io/projected/02e5400d-5c59-42f1-b5a9-08b5a2f449fe-kube-api-access-ppp89\") pod \"managed-serviceaccount-addon-agent-6b946f688f-rqvbb\" (UID: \"02e5400d-5c59-42f1-b5a9-08b5a2f449fe\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b946f688f-rqvbb" Apr 21 16:02:15.809558 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.809443 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttnzs\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-kube-api-access-ttnzs\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc" Apr 21 16:02:15.809558 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.809476 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/17562fa5-f0c9-4d56-b775-49c5c552b472-tmp\") pod \"klusterlet-addon-workmgr-cb4bb4dbb-tzdj4\" (UID: \"17562fa5-f0c9-4d56-b775-49c5c552b472\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4" Apr 21 16:02:15.809558 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.809530 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/17562fa5-f0c9-4d56-b775-49c5c552b472-klusterlet-config\") pod \"klusterlet-addon-workmgr-cb4bb4dbb-tzdj4\" (UID: \"17562fa5-f0c9-4d56-b775-49c5c552b472\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4" Apr 21 16:02:15.809760 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.809587 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nptzd\" (UniqueName: \"kubernetes.io/projected/17562fa5-f0c9-4d56-b775-49c5c552b472-kube-api-access-nptzd\") pod \"klusterlet-addon-workmgr-cb4bb4dbb-tzdj4\" (UID: \"17562fa5-f0c9-4d56-b775-49c5c552b472\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4" Apr 21 16:02:15.809760 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.809616 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/67e3a2b6-e556-409b-98e0-59a2af74476f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-67dc99f99-9jzzs\" (UID: \"67e3a2b6-e556-409b-98e0-59a2af74476f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs" Apr 21 16:02:15.809760 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.809639 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert\") pod \"ingress-canary-bqkbv\" (UID: \"89cebd20-5145-4196-8844-826bc1ec6662\") " pod="openshift-ingress-canary/ingress-canary-bqkbv"
Apr 21 16:02:15.809760 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.809657 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/67e3a2b6-e556-409b-98e0-59a2af74476f-ca\") pod \"cluster-proxy-proxy-agent-67dc99f99-9jzzs\" (UID: \"67e3a2b6-e556-409b-98e0-59a2af74476f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs"
Apr 21 16:02:15.809760 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.809701 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/67e3a2b6-e556-409b-98e0-59a2af74476f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-67dc99f99-9jzzs\" (UID: \"67e3a2b6-e556-409b-98e0-59a2af74476f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs"
Apr 21 16:02:15.809760 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.809744 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn5vh\" (UniqueName: \"kubernetes.io/projected/67e3a2b6-e556-409b-98e0-59a2af74476f-kube-api-access-wn5vh\") pod \"cluster-proxy-proxy-agent-67dc99f99-9jzzs\" (UID: \"67e3a2b6-e556-409b-98e0-59a2af74476f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs"
Apr 21 16:02:15.810009 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.809826 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pswwr\" (UniqueName: \"kubernetes.io/projected/89cebd20-5145-4196-8844-826bc1ec6662-kube-api-access-pswwr\") pod \"ingress-canary-bqkbv\" (UID: \"89cebd20-5145-4196-8844-826bc1ec6662\") " pod="openshift-ingress-canary/ingress-canary-bqkbv"
Apr 21 16:02:15.810009 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.809860 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/67e3a2b6-e556-409b-98e0-59a2af74476f-hub\") pod \"cluster-proxy-proxy-agent-67dc99f99-9jzzs\" (UID: \"67e3a2b6-e556-409b-98e0-59a2af74476f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs"
Apr 21 16:02:15.810009 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.809880 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/67e3a2b6-e556-409b-98e0-59a2af74476f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-67dc99f99-9jzzs\" (UID: \"67e3a2b6-e556-409b-98e0-59a2af74476f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs"
Apr 21 16:02:15.810009 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.809981 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d9517991-a0f5-431a-9763-7400c577640c-image-registry-private-configuration\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:15.810182 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.810016 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d9517991-a0f5-431a-9763-7400c577640c-ca-trust-extracted\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:15.810182 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.810040 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d9517991-a0f5-431a-9763-7400c577640c-registry-certificates\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:15.810182 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.810063 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9517991-a0f5-431a-9763-7400c577640c-trusted-ca\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:15.810182 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.810099 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d9517991-a0f5-431a-9763-7400c577640c-installation-pull-secrets\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:15.813005 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.812987 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mp77f"
Apr 21 16:02:15.816826 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.816800 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2l66p\""
Apr 21 16:02:15.817356 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.817342 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 21 16:02:15.817984 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.817964 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mp77f"]
Apr 21 16:02:15.818241 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.818194 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 21 16:02:15.910733 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.910652 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d9517991-a0f5-431a-9763-7400c577640c-image-registry-private-configuration\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:15.910733 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.910692 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d9517991-a0f5-431a-9763-7400c577640c-ca-trust-extracted\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:15.910949 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.910830 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d9517991-a0f5-431a-9763-7400c577640c-registry-certificates\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:15.910949 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.910873 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9517991-a0f5-431a-9763-7400c577640c-trusted-ca\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:15.911462 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911100 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d9517991-a0f5-431a-9763-7400c577640c-installation-pull-secrets\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:15.911462 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911123 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d9517991-a0f5-431a-9763-7400c577640c-ca-trust-extracted\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:15.911462 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911179 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:15.911462 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911203 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-bound-sa-token\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:15.911462 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911227 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/02e5400d-5c59-42f1-b5a9-08b5a2f449fe-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6b946f688f-rqvbb\" (UID: \"02e5400d-5c59-42f1-b5a9-08b5a2f449fe\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b946f688f-rqvbb"
Apr 21 16:02:15.911462 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911253 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppp89\" (UniqueName: \"kubernetes.io/projected/02e5400d-5c59-42f1-b5a9-08b5a2f449fe-kube-api-access-ppp89\") pod \"managed-serviceaccount-addon-agent-6b946f688f-rqvbb\" (UID: \"02e5400d-5c59-42f1-b5a9-08b5a2f449fe\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b946f688f-rqvbb"
Apr 21 16:02:15.911462 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911281 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttnzs\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-kube-api-access-ttnzs\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:15.911462 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:15.911293 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 16:02:15.911462 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911306 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/17562fa5-f0c9-4d56-b775-49c5c552b472-tmp\") pod \"klusterlet-addon-workmgr-cb4bb4dbb-tzdj4\" (UID: \"17562fa5-f0c9-4d56-b775-49c5c552b472\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4"
Apr 21 16:02:15.911462 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:15.911311 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-685b9dc9fd-l8klc: secret "image-registry-tls" not found
Apr 21 16:02:15.911462 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911330 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/17562fa5-f0c9-4d56-b775-49c5c552b472-klusterlet-config\") pod \"klusterlet-addon-workmgr-cb4bb4dbb-tzdj4\" (UID: \"17562fa5-f0c9-4d56-b775-49c5c552b472\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4"
Apr 21 16:02:15.911462 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911359 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls\") pod \"dns-default-mp77f\" (UID: \"22292ed9-e7f3-49e8-8973-30a2e2fa17a2\") " pod="openshift-dns/dns-default-mp77f"
Apr 21 16:02:15.911462 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:15.911375 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls podName:d9517991-a0f5-431a-9763-7400c577640c nodeName:}" failed. No retries permitted until 2026-04-21 16:02:16.411356493 +0000 UTC m=+33.018258591 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls") pod "image-registry-685b9dc9fd-l8klc" (UID: "d9517991-a0f5-431a-9763-7400c577640c") : secret "image-registry-tls" not found
Apr 21 16:02:15.911462 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911455 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d9517991-a0f5-431a-9763-7400c577640c-registry-certificates\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:15.912197 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911512 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chq8l\" (UniqueName: \"kubernetes.io/projected/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-kube-api-access-chq8l\") pod \"dns-default-mp77f\" (UID: \"22292ed9-e7f3-49e8-8973-30a2e2fa17a2\") " pod="openshift-dns/dns-default-mp77f"
Apr 21 16:02:15.912197 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911570 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nptzd\" (UniqueName: \"kubernetes.io/projected/17562fa5-f0c9-4d56-b775-49c5c552b472-kube-api-access-nptzd\") pod \"klusterlet-addon-workmgr-cb4bb4dbb-tzdj4\" (UID: \"17562fa5-f0c9-4d56-b775-49c5c552b472\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4"
Apr 21 16:02:15.912197 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911599 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-config-volume\") pod \"dns-default-mp77f\" (UID: \"22292ed9-e7f3-49e8-8973-30a2e2fa17a2\") " pod="openshift-dns/dns-default-mp77f"
Apr 21 16:02:15.912197 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/67e3a2b6-e556-409b-98e0-59a2af74476f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-67dc99f99-9jzzs\" (UID: \"67e3a2b6-e556-409b-98e0-59a2af74476f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs"
Apr 21 16:02:15.912197 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911668 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert\") pod \"ingress-canary-bqkbv\" (UID: \"89cebd20-5145-4196-8844-826bc1ec6662\") " pod="openshift-ingress-canary/ingress-canary-bqkbv"
Apr 21 16:02:15.912197 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911698 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/67e3a2b6-e556-409b-98e0-59a2af74476f-ca\") pod \"cluster-proxy-proxy-agent-67dc99f99-9jzzs\" (UID: \"67e3a2b6-e556-409b-98e0-59a2af74476f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs"
Apr 21 16:02:15.912197 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911725 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/67e3a2b6-e556-409b-98e0-59a2af74476f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-67dc99f99-9jzzs\" (UID: \"67e3a2b6-e556-409b-98e0-59a2af74476f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs"
Apr 21 16:02:15.912197 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911759 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9517991-a0f5-431a-9763-7400c577640c-trusted-ca\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:15.912197 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911791 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-tmp-dir\") pod \"dns-default-mp77f\" (UID: \"22292ed9-e7f3-49e8-8973-30a2e2fa17a2\") " pod="openshift-dns/dns-default-mp77f"
Apr 21 16:02:15.912197 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911847 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wn5vh\" (UniqueName: \"kubernetes.io/projected/67e3a2b6-e556-409b-98e0-59a2af74476f-kube-api-access-wn5vh\") pod \"cluster-proxy-proxy-agent-67dc99f99-9jzzs\" (UID: \"67e3a2b6-e556-409b-98e0-59a2af74476f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs"
Apr 21 16:02:15.912197 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911892 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pswwr\" (UniqueName: \"kubernetes.io/projected/89cebd20-5145-4196-8844-826bc1ec6662-kube-api-access-pswwr\") pod \"ingress-canary-bqkbv\" (UID: \"89cebd20-5145-4196-8844-826bc1ec6662\") " pod="openshift-ingress-canary/ingress-canary-bqkbv"
Apr 21 16:02:15.912197 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911933 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/67e3a2b6-e556-409b-98e0-59a2af74476f-hub\") pod \"cluster-proxy-proxy-agent-67dc99f99-9jzzs\" (UID: \"67e3a2b6-e556-409b-98e0-59a2af74476f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs"
Apr 21 16:02:15.912197 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.911957 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/67e3a2b6-e556-409b-98e0-59a2af74476f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-67dc99f99-9jzzs\" (UID: \"67e3a2b6-e556-409b-98e0-59a2af74476f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs"
Apr 21 16:02:15.912197 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.912049 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/17562fa5-f0c9-4d56-b775-49c5c552b472-tmp\") pod \"klusterlet-addon-workmgr-cb4bb4dbb-tzdj4\" (UID: \"17562fa5-f0c9-4d56-b775-49c5c552b472\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4"
Apr 21 16:02:15.912197 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:15.912144 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 16:02:15.912197 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:15.912190 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert podName:89cebd20-5145-4196-8844-826bc1ec6662 nodeName:}" failed. No retries permitted until 2026-04-21 16:02:16.412172538 +0000 UTC m=+33.019074635 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert") pod "ingress-canary-bqkbv" (UID: "89cebd20-5145-4196-8844-826bc1ec6662") : secret "canary-serving-cert" not found
Apr 21 16:02:15.913182 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.912885 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/67e3a2b6-e556-409b-98e0-59a2af74476f-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-67dc99f99-9jzzs\" (UID: \"67e3a2b6-e556-409b-98e0-59a2af74476f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs"
Apr 21 16:02:15.916998 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.916975 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d9517991-a0f5-431a-9763-7400c577640c-image-registry-private-configuration\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:15.917096 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.917065 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d9517991-a0f5-431a-9763-7400c577640c-installation-pull-secrets\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:15.917367 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.917347 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/67e3a2b6-e556-409b-98e0-59a2af74476f-hub\") pod \"cluster-proxy-proxy-agent-67dc99f99-9jzzs\" (UID: \"67e3a2b6-e556-409b-98e0-59a2af74476f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs"
Apr 21 16:02:15.917446 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.917353 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/02e5400d-5c59-42f1-b5a9-08b5a2f449fe-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6b946f688f-rqvbb\" (UID: \"02e5400d-5c59-42f1-b5a9-08b5a2f449fe\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b946f688f-rqvbb"
Apr 21 16:02:15.917566 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.917543 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/17562fa5-f0c9-4d56-b775-49c5c552b472-klusterlet-config\") pod \"klusterlet-addon-workmgr-cb4bb4dbb-tzdj4\" (UID: \"17562fa5-f0c9-4d56-b775-49c5c552b472\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4"
Apr 21 16:02:15.917842 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.917821 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/67e3a2b6-e556-409b-98e0-59a2af74476f-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-67dc99f99-9jzzs\" (UID: \"67e3a2b6-e556-409b-98e0-59a2af74476f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs"
Apr 21 16:02:15.917944 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.917862 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/67e3a2b6-e556-409b-98e0-59a2af74476f-ca\") pod \"cluster-proxy-proxy-agent-67dc99f99-9jzzs\" (UID: \"67e3a2b6-e556-409b-98e0-59a2af74476f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs"
Apr 21 16:02:15.918499 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.918480 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/67e3a2b6-e556-409b-98e0-59a2af74476f-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-67dc99f99-9jzzs\" (UID: \"67e3a2b6-e556-409b-98e0-59a2af74476f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs"
Apr 21 16:02:15.927131 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.927088 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nptzd\" (UniqueName: \"kubernetes.io/projected/17562fa5-f0c9-4d56-b775-49c5c552b472-kube-api-access-nptzd\") pod \"klusterlet-addon-workmgr-cb4bb4dbb-tzdj4\" (UID: \"17562fa5-f0c9-4d56-b775-49c5c552b472\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4"
Apr 21 16:02:15.929466 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.929424 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-bound-sa-token\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:15.929702 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.929678 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttnzs\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-kube-api-access-ttnzs\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:15.930098 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.930058 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppp89\" (UniqueName: \"kubernetes.io/projected/02e5400d-5c59-42f1-b5a9-08b5a2f449fe-kube-api-access-ppp89\") pod \"managed-serviceaccount-addon-agent-6b946f688f-rqvbb\" (UID: \"02e5400d-5c59-42f1-b5a9-08b5a2f449fe\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b946f688f-rqvbb"
Apr 21 16:02:15.939320 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.939287 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn5vh\" (UniqueName: \"kubernetes.io/projected/67e3a2b6-e556-409b-98e0-59a2af74476f-kube-api-access-wn5vh\") pod \"cluster-proxy-proxy-agent-67dc99f99-9jzzs\" (UID: \"67e3a2b6-e556-409b-98e0-59a2af74476f\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs"
Apr 21 16:02:15.941763 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:15.941743 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pswwr\" (UniqueName: \"kubernetes.io/projected/89cebd20-5145-4196-8844-826bc1ec6662-kube-api-access-pswwr\") pod \"ingress-canary-bqkbv\" (UID: \"89cebd20-5145-4196-8844-826bc1ec6662\") " pod="openshift-ingress-canary/ingress-canary-bqkbv"
Apr 21 16:02:16.013135 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:16.013099 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-tmp-dir\") pod \"dns-default-mp77f\" (UID: \"22292ed9-e7f3-49e8-8973-30a2e2fa17a2\") " pod="openshift-dns/dns-default-mp77f"
Apr 21 16:02:16.013298 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:16.013283 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls\") pod \"dns-default-mp77f\" (UID: \"22292ed9-e7f3-49e8-8973-30a2e2fa17a2\") " pod="openshift-dns/dns-default-mp77f"
Apr 21 16:02:16.013375 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:16.013311 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chq8l\" (UniqueName: \"kubernetes.io/projected/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-kube-api-access-chq8l\") pod \"dns-default-mp77f\" (UID: \"22292ed9-e7f3-49e8-8973-30a2e2fa17a2\") " pod="openshift-dns/dns-default-mp77f"
Apr 21 16:02:16.013375 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:16.013357 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-config-volume\") pod \"dns-default-mp77f\" (UID: \"22292ed9-e7f3-49e8-8973-30a2e2fa17a2\") " pod="openshift-dns/dns-default-mp77f"
Apr 21 16:02:16.013466 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:16.013434 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 16:02:16.013531 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:16.013504 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls podName:22292ed9-e7f3-49e8-8973-30a2e2fa17a2 nodeName:}" failed. No retries permitted until 2026-04-21 16:02:16.513484621 +0000 UTC m=+33.120386721 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls") pod "dns-default-mp77f" (UID: "22292ed9-e7f3-49e8-8973-30a2e2fa17a2") : secret "dns-default-metrics-tls" not found
Apr 21 16:02:16.013531 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:16.013512 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-tmp-dir\") pod \"dns-default-mp77f\" (UID: \"22292ed9-e7f3-49e8-8973-30a2e2fa17a2\") " pod="openshift-dns/dns-default-mp77f"
Apr 21 16:02:16.018382 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:16.018356 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-config-volume\") pod \"dns-default-mp77f\" (UID: \"22292ed9-e7f3-49e8-8973-30a2e2fa17a2\") " pod="openshift-dns/dns-default-mp77f"
Apr 21 16:02:16.022710 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:16.022687 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chq8l\" (UniqueName: \"kubernetes.io/projected/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-kube-api-access-chq8l\") pod \"dns-default-mp77f\" (UID: \"22292ed9-e7f3-49e8-8973-30a2e2fa17a2\") " pod="openshift-dns/dns-default-mp77f"
Apr 21 16:02:16.027737 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:16.027708 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs"
Apr 21 16:02:16.036094 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:16.036071 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4"
Apr 21 16:02:16.042820 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:16.042798 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b946f688f-rqvbb"
Apr 21 16:02:16.417249 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:16.417211 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:16.418024 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:16.417307 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert\") pod \"ingress-canary-bqkbv\" (UID: \"89cebd20-5145-4196-8844-826bc1ec6662\") " pod="openshift-ingress-canary/ingress-canary-bqkbv"
Apr 21 16:02:16.418024 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:16.417391 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 16:02:16.418024 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:16.417414 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-685b9dc9fd-l8klc: secret "image-registry-tls" not found
Apr 21 16:02:16.418024 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:16.417435 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 16:02:16.418024 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:16.417498 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls podName:d9517991-a0f5-431a-9763-7400c577640c nodeName:}" failed. No retries permitted until 2026-04-21 16:02:17.417477608 +0000 UTC m=+34.024379697 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls") pod "image-registry-685b9dc9fd-l8klc" (UID: "d9517991-a0f5-431a-9763-7400c577640c") : secret "image-registry-tls" not found
Apr 21 16:02:16.418024 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:16.417517 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert podName:89cebd20-5145-4196-8844-826bc1ec6662 nodeName:}" failed. No retries permitted until 2026-04-21 16:02:17.417508475 +0000 UTC m=+34.024410563 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert") pod "ingress-canary-bqkbv" (UID: "89cebd20-5145-4196-8844-826bc1ec6662") : secret "canary-serving-cert" not found
Apr 21 16:02:16.518075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:16.518034 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls\") pod \"dns-default-mp77f\" (UID: \"22292ed9-e7f3-49e8-8973-30a2e2fa17a2\") " pod="openshift-dns/dns-default-mp77f"
Apr 21 16:02:16.518232 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:16.518209 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 16:02:16.518302 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:16.518292 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls podName:22292ed9-e7f3-49e8-8973-30a2e2fa17a2 nodeName:}" failed. No retries permitted until 2026-04-21 16:02:17.518277707 +0000 UTC m=+34.125179796 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls") pod "dns-default-mp77f" (UID: "22292ed9-e7f3-49e8-8973-30a2e2fa17a2") : secret "dns-default-metrics-tls" not found
Apr 21 16:02:17.050250 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:17.050213 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gq94"
Apr 21 16:02:17.050250 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:17.050212 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhl4l"
Apr 21 16:02:17.050576 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:17.050209 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmhv9"
Apr 21 16:02:17.053912 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:17.053884 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cz78x\""
Apr 21 16:02:17.054278 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:17.054258 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 16:02:17.054401 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:17.054295 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 16:02:17.054401 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:17.054295 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 16:02:17.054401 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:17.054335 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 16:02:17.054757 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:17.054740 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jk8gf\""
Apr 21 16:02:17.426304 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:17.426265 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:17.426718 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:17.426352 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert\") pod \"ingress-canary-bqkbv\" (UID: \"89cebd20-5145-4196-8844-826bc1ec6662\") " pod="openshift-ingress-canary/ingress-canary-bqkbv"
Apr 21 16:02:17.426718 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:17.426428 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 16:02:17.426718 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:17.426452 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-685b9dc9fd-l8klc: secret "image-registry-tls" not found
Apr 21 16:02:17.426718 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:17.426495 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 16:02:17.426718 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:17.426516 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls podName:d9517991-a0f5-431a-9763-7400c577640c nodeName:}" failed. No retries permitted until 2026-04-21 16:02:19.42649641 +0000 UTC m=+36.033398503 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls") pod "image-registry-685b9dc9fd-l8klc" (UID: "d9517991-a0f5-431a-9763-7400c577640c") : secret "image-registry-tls" not found
Apr 21 16:02:17.426718 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:17.426571 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert podName:89cebd20-5145-4196-8844-826bc1ec6662 nodeName:}" failed. No retries permitted until 2026-04-21 16:02:19.42655839 +0000 UTC m=+36.033460493 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert") pod "ingress-canary-bqkbv" (UID: "89cebd20-5145-4196-8844-826bc1ec6662") : secret "canary-serving-cert" not found
Apr 21 16:02:17.526974 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:17.526934 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls\") pod \"dns-default-mp77f\" (UID: \"22292ed9-e7f3-49e8-8973-30a2e2fa17a2\") " pod="openshift-dns/dns-default-mp77f"
Apr 21 16:02:17.527155 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:17.527107 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 16:02:17.527221 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:17.527183 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls podName:22292ed9-e7f3-49e8-8973-30a2e2fa17a2
nodeName:}" failed. No retries permitted until 2026-04-21 16:02:19.527164274 +0000 UTC m=+36.134066361 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls") pod "dns-default-mp77f" (UID: "22292ed9-e7f3-49e8-8973-30a2e2fa17a2") : secret "dns-default-metrics-tls" not found Apr 21 16:02:17.728635 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:17.728390 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs\") pod \"network-metrics-daemon-qhl4l\" (UID: \"4c538f88-ee19-454d-9d5b-09dd0a3ed71b\") " pod="openshift-multus/network-metrics-daemon-qhl4l" Apr 21 16:02:17.728833 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:17.728571 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 16:02:17.728899 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:17.728893 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs podName:4c538f88-ee19-454d-9d5b-09dd0a3ed71b nodeName:}" failed. No retries permitted until 2026-04-21 16:02:49.728870753 +0000 UTC m=+66.335772862 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs") pod "network-metrics-daemon-qhl4l" (UID: "4c538f88-ee19-454d-9d5b-09dd0a3ed71b") : secret "metrics-daemon-secret" not found Apr 21 16:02:17.826547 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:17.826410 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4"] Apr 21 16:02:17.827602 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:17.827576 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b946f688f-rqvbb"] Apr 21 16:02:17.828240 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:17.828219 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs"] Apr 21 16:02:17.829316 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:17.829296 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs864\" (UniqueName: \"kubernetes.io/projected/92ed2261-b49e-45eb-9081-57d883cfcf5a-kube-api-access-gs864\") pod \"network-check-target-8gq94\" (UID: \"92ed2261-b49e-45eb-9081-57d883cfcf5a\") " pod="openshift-network-diagnostics/network-check-target-8gq94" Apr 21 16:02:17.832626 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:17.832608 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs864\" (UniqueName: \"kubernetes.io/projected/92ed2261-b49e-45eb-9081-57d883cfcf5a-kube-api-access-gs864\") pod \"network-check-target-8gq94\" (UID: \"92ed2261-b49e-45eb-9081-57d883cfcf5a\") " pod="openshift-network-diagnostics/network-check-target-8gq94" Apr 21 16:02:17.881196 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:02:17.881141 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02e5400d_5c59_42f1_b5a9_08b5a2f449fe.slice/crio-4fb664c338341f9f387939fc3ffac5e3df72834766a6237d42973599016560bb WatchSource:0}: Error finding container 4fb664c338341f9f387939fc3ffac5e3df72834766a6237d42973599016560bb: Status 404 returned error can't find the container with id 4fb664c338341f9f387939fc3ffac5e3df72834766a6237d42973599016560bb Apr 21 16:02:17.881715 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:02:17.881690 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17562fa5_f0c9_4d56_b775_49c5c552b472.slice/crio-c34eacb14bd1bc5760187fc5e1c10e660d06e99eb281a0ab0147278be27f5e57 WatchSource:0}: Error finding container c34eacb14bd1bc5760187fc5e1c10e660d06e99eb281a0ab0147278be27f5e57: Status 404 returned error can't find the container with id c34eacb14bd1bc5760187fc5e1c10e660d06e99eb281a0ab0147278be27f5e57 Apr 21 16:02:17.882215 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:02:17.882191 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67e3a2b6_e556_409b_98e0_59a2af74476f.slice/crio-3dd7b99d8c041d4b83f5b36cd95fb5451efdfd564e6cf7a1fd4753900718c7dc WatchSource:0}: Error finding container 3dd7b99d8c041d4b83f5b36cd95fb5451efdfd564e6cf7a1fd4753900718c7dc: Status 404 returned error can't find the container with id 3dd7b99d8c041d4b83f5b36cd95fb5451efdfd564e6cf7a1fd4753900718c7dc Apr 21 16:02:17.964017 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:17.963986 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8gq94" Apr 21 16:02:18.104822 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:18.104753 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-8gq94"] Apr 21 16:02:18.108210 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:02:18.108185 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92ed2261_b49e_45eb_9081_57d883cfcf5a.slice/crio-d98236c3393f1ed9afd6f811421c32449c50dda74c7db0efef1b25ef1bb135c2 WatchSource:0}: Error finding container d98236c3393f1ed9afd6f811421c32449c50dda74c7db0efef1b25ef1bb135c2: Status 404 returned error can't find the container with id d98236c3393f1ed9afd6f811421c32449c50dda74c7db0efef1b25ef1bb135c2 Apr 21 16:02:18.198273 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:18.198241 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b946f688f-rqvbb" event={"ID":"02e5400d-5c59-42f1-b5a9-08b5a2f449fe","Type":"ContainerStarted","Data":"4fb664c338341f9f387939fc3ffac5e3df72834766a6237d42973599016560bb"} Apr 21 16:02:18.199345 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:18.199322 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-8gq94" event={"ID":"92ed2261-b49e-45eb-9081-57d883cfcf5a","Type":"ContainerStarted","Data":"d98236c3393f1ed9afd6f811421c32449c50dda74c7db0efef1b25ef1bb135c2"} Apr 21 16:02:18.200185 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:18.200159 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4" event={"ID":"17562fa5-f0c9-4d56-b775-49c5c552b472","Type":"ContainerStarted","Data":"c34eacb14bd1bc5760187fc5e1c10e660d06e99eb281a0ab0147278be27f5e57"} Apr 21 16:02:18.202609 ip-10-0-129-96 
kubenswrapper[2577]: I0421 16:02:18.202589 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8jszj" event={"ID":"7640279c-4346-4f46-b222-3c04e5d7569e","Type":"ContainerStarted","Data":"18059ea65ef0956693cf380fdb0c7ec93c0460076228c2765ef7344a4ce2ccec"} Apr 21 16:02:18.203506 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:18.203482 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs" event={"ID":"67e3a2b6-e556-409b-98e0-59a2af74476f","Type":"ContainerStarted","Data":"3dd7b99d8c041d4b83f5b36cd95fb5451efdfd564e6cf7a1fd4753900718c7dc"} Apr 21 16:02:19.211561 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:19.211523 2577 generic.go:358] "Generic (PLEG): container finished" podID="7640279c-4346-4f46-b222-3c04e5d7569e" containerID="18059ea65ef0956693cf380fdb0c7ec93c0460076228c2765ef7344a4ce2ccec" exitCode=0 Apr 21 16:02:19.212029 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:19.211590 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8jszj" event={"ID":"7640279c-4346-4f46-b222-3c04e5d7569e","Type":"ContainerDied","Data":"18059ea65ef0956693cf380fdb0c7ec93c0460076228c2765ef7344a4ce2ccec"} Apr 21 16:02:19.446970 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:19.446938 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc" Apr 21 16:02:19.447114 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:19.447021 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert\") pod 
\"ingress-canary-bqkbv\" (UID: \"89cebd20-5145-4196-8844-826bc1ec6662\") " pod="openshift-ingress-canary/ingress-canary-bqkbv" Apr 21 16:02:19.447199 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:19.447182 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 16:02:19.447271 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:19.447259 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert podName:89cebd20-5145-4196-8844-826bc1ec6662 nodeName:}" failed. No retries permitted until 2026-04-21 16:02:23.447239433 +0000 UTC m=+40.054141523 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert") pod "ingress-canary-bqkbv" (UID: "89cebd20-5145-4196-8844-826bc1ec6662") : secret "canary-serving-cert" not found Apr 21 16:02:19.447811 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:19.447792 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 16:02:19.447811 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:19.447813 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-685b9dc9fd-l8klc: secret "image-registry-tls" not found Apr 21 16:02:19.447975 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:19.447855 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls podName:d9517991-a0f5-431a-9763-7400c577640c nodeName:}" failed. No retries permitted until 2026-04-21 16:02:23.44784192 +0000 UTC m=+40.054744020 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls") pod "image-registry-685b9dc9fd-l8klc" (UID: "d9517991-a0f5-431a-9763-7400c577640c") : secret "image-registry-tls" not found Apr 21 16:02:19.548669 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:19.547709 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls\") pod \"dns-default-mp77f\" (UID: \"22292ed9-e7f3-49e8-8973-30a2e2fa17a2\") " pod="openshift-dns/dns-default-mp77f" Apr 21 16:02:19.548669 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:19.548308 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 16:02:19.548669 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:19.548377 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls podName:22292ed9-e7f3-49e8-8973-30a2e2fa17a2 nodeName:}" failed. No retries permitted until 2026-04-21 16:02:23.548358784 +0000 UTC m=+40.155260873 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls") pod "dns-default-mp77f" (UID: "22292ed9-e7f3-49e8-8973-30a2e2fa17a2") : secret "dns-default-metrics-tls" not found Apr 21 16:02:20.225484 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:20.225443 2577 generic.go:358] "Generic (PLEG): container finished" podID="7640279c-4346-4f46-b222-3c04e5d7569e" containerID="d8e2abe12abae90aa35f52f13d59ec8ed6edd359416b603c4992f4c0bcb72dfb" exitCode=0 Apr 21 16:02:20.225990 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:20.225575 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8jszj" event={"ID":"7640279c-4346-4f46-b222-3c04e5d7569e","Type":"ContainerDied","Data":"d8e2abe12abae90aa35f52f13d59ec8ed6edd359416b603c4992f4c0bcb72dfb"} Apr 21 16:02:20.964788 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:20.964728 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/27182b2f-30c8-4cef-90b0-b91b4f04047a-original-pull-secret\") pod \"global-pull-secret-syncer-fmhv9\" (UID: \"27182b2f-30c8-4cef-90b0-b91b4f04047a\") " pod="kube-system/global-pull-secret-syncer-fmhv9" Apr 21 16:02:20.972681 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:20.972648 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/27182b2f-30c8-4cef-90b0-b91b4f04047a-original-pull-secret\") pod \"global-pull-secret-syncer-fmhv9\" (UID: \"27182b2f-30c8-4cef-90b0-b91b4f04047a\") " pod="kube-system/global-pull-secret-syncer-fmhv9" Apr 21 16:02:21.271124 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:21.271084 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmhv9" Apr 21 16:02:23.481186 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:23.481147 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc" Apr 21 16:02:23.481186 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:23.481199 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert\") pod \"ingress-canary-bqkbv\" (UID: \"89cebd20-5145-4196-8844-826bc1ec6662\") " pod="openshift-ingress-canary/ingress-canary-bqkbv" Apr 21 16:02:23.481849 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:23.481304 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 16:02:23.481849 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:23.481321 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 16:02:23.481849 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:23.481348 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-685b9dc9fd-l8klc: secret "image-registry-tls" not found Apr 21 16:02:23.481849 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:23.481384 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert podName:89cebd20-5145-4196-8844-826bc1ec6662 nodeName:}" failed. No retries permitted until 2026-04-21 16:02:31.481368439 +0000 UTC m=+48.088270524 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert") pod "ingress-canary-bqkbv" (UID: "89cebd20-5145-4196-8844-826bc1ec6662") : secret "canary-serving-cert" not found Apr 21 16:02:23.481849 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:23.481397 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls podName:d9517991-a0f5-431a-9763-7400c577640c nodeName:}" failed. No retries permitted until 2026-04-21 16:02:31.481391625 +0000 UTC m=+48.088293710 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls") pod "image-registry-685b9dc9fd-l8klc" (UID: "d9517991-a0f5-431a-9763-7400c577640c") : secret "image-registry-tls" not found Apr 21 16:02:23.581994 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:23.581958 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls\") pod \"dns-default-mp77f\" (UID: \"22292ed9-e7f3-49e8-8973-30a2e2fa17a2\") " pod="openshift-dns/dns-default-mp77f" Apr 21 16:02:23.582159 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:23.582089 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 16:02:23.582159 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:23.582148 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls podName:22292ed9-e7f3-49e8-8973-30a2e2fa17a2 nodeName:}" failed. No retries permitted until 2026-04-21 16:02:31.582134465 +0000 UTC m=+48.189036550 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls") pod "dns-default-mp77f" (UID: "22292ed9-e7f3-49e8-8973-30a2e2fa17a2") : secret "dns-default-metrics-tls" not found Apr 21 16:02:25.676384 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:25.676359 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fmhv9"] Apr 21 16:02:25.681275 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:02:25.681251 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27182b2f_30c8_4cef_90b0_b91b4f04047a.slice/crio-59a7b82efb741dc7c1a6056c37a4a736a0f850b6dab648da6e2a51b5387d4249 WatchSource:0}: Error finding container 59a7b82efb741dc7c1a6056c37a4a736a0f850b6dab648da6e2a51b5387d4249: Status 404 returned error can't find the container with id 59a7b82efb741dc7c1a6056c37a4a736a0f850b6dab648da6e2a51b5387d4249 Apr 21 16:02:26.239958 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:26.239922 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-8gq94" event={"ID":"92ed2261-b49e-45eb-9081-57d883cfcf5a","Type":"ContainerStarted","Data":"e65c349e8b9d1f9cabe41a522a9b098541aa72a8c026ab0d7b3b2d180821241f"} Apr 21 16:02:26.240163 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:26.240066 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-8gq94" Apr 21 16:02:26.241341 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:26.241320 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4" event={"ID":"17562fa5-f0c9-4d56-b775-49c5c552b472","Type":"ContainerStarted","Data":"d3e9cb680c8baf20353c7a45e6773db03da88c9b3fef5c8d905f47faf5aeec74"} Apr 21 16:02:26.241514 ip-10-0-129-96 
kubenswrapper[2577]: I0421 16:02:26.241484 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4" Apr 21 16:02:26.243580 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:26.243551 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4" Apr 21 16:02:26.244738 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:26.244711 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8jszj" event={"ID":"7640279c-4346-4f46-b222-3c04e5d7569e","Type":"ContainerStarted","Data":"e74577a67a1e6129188faff12b5c349057c4596f84582312a92b8b062d683b6e"} Apr 21 16:02:26.246019 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:26.246001 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs" event={"ID":"67e3a2b6-e556-409b-98e0-59a2af74476f","Type":"ContainerStarted","Data":"408b31d7aa38f39cafe58ed9b5892e782ffd62cbadc00a17d3e821ba6f6280f3"} Apr 21 16:02:26.247334 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:26.247309 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b946f688f-rqvbb" event={"ID":"02e5400d-5c59-42f1-b5a9-08b5a2f449fe","Type":"ContainerStarted","Data":"06a471c5a8dd59a2d640f2f365b995f81be12bf4ef94b464ff7754f30b7c433e"} Apr 21 16:02:26.248287 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:26.248266 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fmhv9" event={"ID":"27182b2f-30c8-4cef-90b0-b91b4f04047a","Type":"ContainerStarted","Data":"59a7b82efb741dc7c1a6056c37a4a736a0f850b6dab648da6e2a51b5387d4249"} Apr 21 16:02:26.263571 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:26.263520 2577 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-8gq94" podStartSLOduration=34.83365104 podStartE2EDuration="42.263505708s" podCreationTimestamp="2026-04-21 16:01:44 +0000 UTC" firstStartedPulling="2026-04-21 16:02:18.110715224 +0000 UTC m=+34.717617309" lastFinishedPulling="2026-04-21 16:02:25.540569879 +0000 UTC m=+42.147471977" observedRunningTime="2026-04-21 16:02:26.263310615 +0000 UTC m=+42.870212725" watchObservedRunningTime="2026-04-21 16:02:26.263505708 +0000 UTC m=+42.870407816" Apr 21 16:02:26.283250 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:26.283184 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b946f688f-rqvbb" podStartSLOduration=18.63636901 podStartE2EDuration="26.283165392s" podCreationTimestamp="2026-04-21 16:02:00 +0000 UTC" firstStartedPulling="2026-04-21 16:02:17.893003486 +0000 UTC m=+34.499905570" lastFinishedPulling="2026-04-21 16:02:25.539799867 +0000 UTC m=+42.146701952" observedRunningTime="2026-04-21 16:02:26.283047856 +0000 UTC m=+42.889949964" watchObservedRunningTime="2026-04-21 16:02:26.283165392 +0000 UTC m=+42.890067501" Apr 21 16:02:26.306740 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:26.306684 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4" podStartSLOduration=18.64279552 podStartE2EDuration="26.306669343s" podCreationTimestamp="2026-04-21 16:02:00 +0000 UTC" firstStartedPulling="2026-04-21 16:02:17.892896612 +0000 UTC m=+34.499798700" lastFinishedPulling="2026-04-21 16:02:25.556770427 +0000 UTC m=+42.163672523" observedRunningTime="2026-04-21 16:02:26.305371112 +0000 UTC m=+42.912273218" watchObservedRunningTime="2026-04-21 16:02:26.306669343 +0000 UTC m=+42.913571452" Apr 21 16:02:26.339749 ip-10-0-129-96 kubenswrapper[2577]: I0421 
16:02:26.339696 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-8jszj" podStartSLOduration=11.308771761 podStartE2EDuration="42.33968394s" podCreationTimestamp="2026-04-21 16:01:44 +0000 UTC" firstStartedPulling="2026-04-21 16:01:46.885830224 +0000 UTC m=+3.492732322" lastFinishedPulling="2026-04-21 16:02:17.916742403 +0000 UTC m=+34.523644501" observedRunningTime="2026-04-21 16:02:26.339392316 +0000 UTC m=+42.946294424" watchObservedRunningTime="2026-04-21 16:02:26.33968394 +0000 UTC m=+42.946586046" Apr 21 16:02:30.261071 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:30.260997 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs" event={"ID":"67e3a2b6-e556-409b-98e0-59a2af74476f","Type":"ContainerStarted","Data":"08857133fd9133ba1bbdd695d4bb90ed1af0221791797a89b05731b3dc7535b7"} Apr 21 16:02:30.261071 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:30.261043 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs" event={"ID":"67e3a2b6-e556-409b-98e0-59a2af74476f","Type":"ContainerStarted","Data":"f324b88e29f7d1d295163b0f885da69d8ed9b0d98ac1cdc79a0bf6fc69ef648a"} Apr 21 16:02:30.264030 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:30.263999 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fmhv9" event={"ID":"27182b2f-30c8-4cef-90b0-b91b4f04047a","Type":"ContainerStarted","Data":"2bac6ac001e9cdae162363fb1b2d96052ac5799d9d7e48ebeb6cb354d7cad0d2"} Apr 21 16:02:30.291075 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:30.290962 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs" podStartSLOduration=18.136531357 podStartE2EDuration="30.29094402s" 
podCreationTimestamp="2026-04-21 16:02:00 +0000 UTC" firstStartedPulling="2026-04-21 16:02:17.892886126 +0000 UTC m=+34.499788226" lastFinishedPulling="2026-04-21 16:02:30.047298802 +0000 UTC m=+46.654200889" observedRunningTime="2026-04-21 16:02:30.290380535 +0000 UTC m=+46.897282643" watchObservedRunningTime="2026-04-21 16:02:30.29094402 +0000 UTC m=+46.897846128"
Apr 21 16:02:30.310124 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:30.310074 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-fmhv9" podStartSLOduration=36.935050814 podStartE2EDuration="41.310059277s" podCreationTimestamp="2026-04-21 16:01:49 +0000 UTC" firstStartedPulling="2026-04-21 16:02:25.683689674 +0000 UTC m=+42.290591773" lastFinishedPulling="2026-04-21 16:02:30.058698137 +0000 UTC m=+46.665600236" observedRunningTime="2026-04-21 16:02:30.309668483 +0000 UTC m=+46.916570592" watchObservedRunningTime="2026-04-21 16:02:30.310059277 +0000 UTC m=+46.916961387"
Apr 21 16:02:31.544713 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:31.544656 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert\") pod \"ingress-canary-bqkbv\" (UID: \"89cebd20-5145-4196-8844-826bc1ec6662\") " pod="openshift-ingress-canary/ingress-canary-bqkbv"
Apr 21 16:02:31.545142 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:31.544746 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:31.545142 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:31.544824 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 16:02:31.545142 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:31.544859 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 16:02:31.545142 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:31.544870 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-685b9dc9fd-l8klc: secret "image-registry-tls" not found
Apr 21 16:02:31.545142 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:31.544885 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert podName:89cebd20-5145-4196-8844-826bc1ec6662 nodeName:}" failed. No retries permitted until 2026-04-21 16:02:47.544871181 +0000 UTC m=+64.151773266 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert") pod "ingress-canary-bqkbv" (UID: "89cebd20-5145-4196-8844-826bc1ec6662") : secret "canary-serving-cert" not found
Apr 21 16:02:31.545385 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:31.545263 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls podName:d9517991-a0f5-431a-9763-7400c577640c nodeName:}" failed. No retries permitted until 2026-04-21 16:02:47.54524043 +0000 UTC m=+64.152142545 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls") pod "image-registry-685b9dc9fd-l8klc" (UID: "d9517991-a0f5-431a-9763-7400c577640c") : secret "image-registry-tls" not found
Apr 21 16:02:31.645745 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:31.645708 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls\") pod \"dns-default-mp77f\" (UID: \"22292ed9-e7f3-49e8-8973-30a2e2fa17a2\") " pod="openshift-dns/dns-default-mp77f"
Apr 21 16:02:31.645972 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:31.645896 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 16:02:31.646033 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:31.645975 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls podName:22292ed9-e7f3-49e8-8973-30a2e2fa17a2 nodeName:}" failed. No retries permitted until 2026-04-21 16:02:47.645955422 +0000 UTC m=+64.252857507 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls") pod "dns-default-mp77f" (UID: "22292ed9-e7f3-49e8-8973-30a2e2fa17a2") : secret "dns-default-metrics-tls" not found
Apr 21 16:02:43.199331 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:43.199295 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6rkhl"
Apr 21 16:02:47.569580 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:47.569526 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:02:47.569989 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:47.569597 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert\") pod \"ingress-canary-bqkbv\" (UID: \"89cebd20-5145-4196-8844-826bc1ec6662\") " pod="openshift-ingress-canary/ingress-canary-bqkbv"
Apr 21 16:02:47.569989 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:47.569674 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 16:02:47.569989 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:47.569694 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-685b9dc9fd-l8klc: secret "image-registry-tls" not found
Apr 21 16:02:47.569989 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:47.569696 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 16:02:47.569989 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:47.569750 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls podName:d9517991-a0f5-431a-9763-7400c577640c nodeName:}" failed. No retries permitted until 2026-04-21 16:03:19.569734297 +0000 UTC m=+96.176636385 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls") pod "image-registry-685b9dc9fd-l8klc" (UID: "d9517991-a0f5-431a-9763-7400c577640c") : secret "image-registry-tls" not found
Apr 21 16:02:47.569989 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:47.569762 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert podName:89cebd20-5145-4196-8844-826bc1ec6662 nodeName:}" failed. No retries permitted until 2026-04-21 16:03:19.56975709 +0000 UTC m=+96.176659175 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert") pod "ingress-canary-bqkbv" (UID: "89cebd20-5145-4196-8844-826bc1ec6662") : secret "canary-serving-cert" not found
Apr 21 16:02:47.670293 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:47.670261 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls\") pod \"dns-default-mp77f\" (UID: \"22292ed9-e7f3-49e8-8973-30a2e2fa17a2\") " pod="openshift-dns/dns-default-mp77f"
Apr 21 16:02:47.670453 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:47.670403 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 16:02:47.670492 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:47.670463 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls podName:22292ed9-e7f3-49e8-8973-30a2e2fa17a2 nodeName:}" failed. No retries permitted until 2026-04-21 16:03:19.670448015 +0000 UTC m=+96.277350101 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls") pod "dns-default-mp77f" (UID: "22292ed9-e7f3-49e8-8973-30a2e2fa17a2") : secret "dns-default-metrics-tls" not found
Apr 21 16:02:49.785220 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:49.785185 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs\") pod \"network-metrics-daemon-qhl4l\" (UID: \"4c538f88-ee19-454d-9d5b-09dd0a3ed71b\") " pod="openshift-multus/network-metrics-daemon-qhl4l"
Apr 21 16:02:49.785583 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:49.785327 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 16:02:49.785583 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:02:49.785378 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs podName:4c538f88-ee19-454d-9d5b-09dd0a3ed71b nodeName:}" failed. No retries permitted until 2026-04-21 16:03:53.78536544 +0000 UTC m=+130.392267525 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs") pod "network-metrics-daemon-qhl4l" (UID: "4c538f88-ee19-454d-9d5b-09dd0a3ed71b") : secret "metrics-daemon-secret" not found
Apr 21 16:02:57.254715 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:02:57.254679 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-8gq94"
Apr 21 16:03:19.612973 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:03:19.612817 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert\") pod \"ingress-canary-bqkbv\" (UID: \"89cebd20-5145-4196-8844-826bc1ec6662\") " pod="openshift-ingress-canary/ingress-canary-bqkbv"
Apr 21 16:03:19.612973 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:03:19.612915 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:03:19.613491 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:03:19.612982 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 16:03:19.613491 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:03:19.613020 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 16:03:19.613491 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:03:19.613035 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-685b9dc9fd-l8klc: secret "image-registry-tls" not found
Apr 21 16:03:19.613491 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:03:19.613056 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert podName:89cebd20-5145-4196-8844-826bc1ec6662 nodeName:}" failed. No retries permitted until 2026-04-21 16:04:23.613038267 +0000 UTC m=+160.219940362 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert") pod "ingress-canary-bqkbv" (UID: "89cebd20-5145-4196-8844-826bc1ec6662") : secret "canary-serving-cert" not found
Apr 21 16:03:19.613491 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:03:19.613096 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls podName:d9517991-a0f5-431a-9763-7400c577640c nodeName:}" failed. No retries permitted until 2026-04-21 16:04:23.613079605 +0000 UTC m=+160.219981692 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls") pod "image-registry-685b9dc9fd-l8klc" (UID: "d9517991-a0f5-431a-9763-7400c577640c") : secret "image-registry-tls" not found
Apr 21 16:03:19.714285 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:03:19.714237 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls\") pod \"dns-default-mp77f\" (UID: \"22292ed9-e7f3-49e8-8973-30a2e2fa17a2\") " pod="openshift-dns/dns-default-mp77f"
Apr 21 16:03:19.714409 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:03:19.714388 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 16:03:19.714466 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:03:19.714456 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls podName:22292ed9-e7f3-49e8-8973-30a2e2fa17a2 nodeName:}" failed. No retries permitted until 2026-04-21 16:04:23.714440118 +0000 UTC m=+160.321342203 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls") pod "dns-default-mp77f" (UID: "22292ed9-e7f3-49e8-8973-30a2e2fa17a2") : secret "dns-default-metrics-tls" not found
Apr 21 16:03:53.860649 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:03:53.860610 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs\") pod \"network-metrics-daemon-qhl4l\" (UID: \"4c538f88-ee19-454d-9d5b-09dd0a3ed71b\") " pod="openshift-multus/network-metrics-daemon-qhl4l"
Apr 21 16:03:53.861162 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:03:53.860750 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 16:03:53.861162 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:03:53.860831 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs podName:4c538f88-ee19-454d-9d5b-09dd0a3ed71b nodeName:}" failed. No retries permitted until 2026-04-21 16:05:55.860816078 +0000 UTC m=+252.467718169 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs") pod "network-metrics-daemon-qhl4l" (UID: "4c538f88-ee19-454d-9d5b-09dd0a3ed71b") : secret "metrics-daemon-secret" not found
Apr 21 16:03:59.388698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:03:59.388665 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-86kzc_4701d99b-4f05-4410-bf15-f5fd3e3bf5bf/dns-node-resolver/0.log"
Apr 21 16:03:59.975404 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:03:59.975374 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8flks_d74dfe95-7664-4441-8273-cc22ee22d89f/node-ca/0.log"
Apr 21 16:04:18.766471 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:04:18.766431 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc" podUID="d9517991-a0f5-431a-9763-7400c577640c"
Apr 21 16:04:18.772588 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:04:18.772557 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-bqkbv" podUID="89cebd20-5145-4196-8844-826bc1ec6662"
Apr 21 16:04:18.824055 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:04:18.824018 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-mp77f" podUID="22292ed9-e7f3-49e8-8973-30a2e2fa17a2"
Apr 21 16:04:19.518896 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:19.518864 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mp77f"
Apr 21 16:04:19.518896 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:19.518891 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bqkbv"
Apr 21 16:04:20.079628 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:04:20.079589 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-qhl4l" podUID="4c538f88-ee19-454d-9d5b-09dd0a3ed71b"
Apr 21 16:04:23.686432 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:23.686381 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:04:23.686432 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:23.686446 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert\") pod \"ingress-canary-bqkbv\" (UID: \"89cebd20-5145-4196-8844-826bc1ec6662\") " pod="openshift-ingress-canary/ingress-canary-bqkbv"
Apr 21 16:04:23.688880 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:23.688858 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls\") pod \"image-registry-685b9dc9fd-l8klc\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc"
Apr 21 16:04:23.688959 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:23.688934 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89cebd20-5145-4196-8844-826bc1ec6662-cert\") pod \"ingress-canary-bqkbv\" (UID: \"89cebd20-5145-4196-8844-826bc1ec6662\") " pod="openshift-ingress-canary/ingress-canary-bqkbv"
Apr 21 16:04:23.723070 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:23.723042 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-76wnm\""
Apr 21 16:04:23.730493 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:23.730471 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bqkbv"
Apr 21 16:04:23.787604 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:23.787572 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls\") pod \"dns-default-mp77f\" (UID: \"22292ed9-e7f3-49e8-8973-30a2e2fa17a2\") " pod="openshift-dns/dns-default-mp77f"
Apr 21 16:04:23.790513 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:23.790491 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/22292ed9-e7f3-49e8-8973-30a2e2fa17a2-metrics-tls\") pod \"dns-default-mp77f\" (UID: \"22292ed9-e7f3-49e8-8973-30a2e2fa17a2\") " pod="openshift-dns/dns-default-mp77f"
Apr 21 16:04:23.847058 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:23.847023 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bqkbv"]
Apr 21 16:04:23.849872 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:04:23.849840 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89cebd20_5145_4196_8844_826bc1ec6662.slice/crio-dfed7155bb1e9a309e55430444164012448c686962302f12bab722acaae73c2a WatchSource:0}: Error finding container dfed7155bb1e9a309e55430444164012448c686962302f12bab722acaae73c2a: Status 404 returned error can't find the container with id dfed7155bb1e9a309e55430444164012448c686962302f12bab722acaae73c2a
Apr 21 16:04:24.022342 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:24.022265 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2l66p\""
Apr 21 16:04:24.029851 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:24.029833 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mp77f"
Apr 21 16:04:24.156727 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:24.156705 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mp77f"]
Apr 21 16:04:24.158511 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:04:24.158485 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22292ed9_e7f3_49e8_8973_30a2e2fa17a2.slice/crio-f76732f445e258710193cc1793f6323749b287235059b48738981d63487d1db4 WatchSource:0}: Error finding container f76732f445e258710193cc1793f6323749b287235059b48738981d63487d1db4: Status 404 returned error can't find the container with id f76732f445e258710193cc1793f6323749b287235059b48738981d63487d1db4
Apr 21 16:04:24.531470 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:24.531429 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bqkbv" event={"ID":"89cebd20-5145-4196-8844-826bc1ec6662","Type":"ContainerStarted","Data":"dfed7155bb1e9a309e55430444164012448c686962302f12bab722acaae73c2a"}
Apr 21 16:04:24.532556 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:24.532528 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mp77f" event={"ID":"22292ed9-e7f3-49e8-8973-30a2e2fa17a2","Type":"ContainerStarted","Data":"f76732f445e258710193cc1793f6323749b287235059b48738981d63487d1db4"}
Apr 21 16:04:26.036636 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:26.036458 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4" podUID="17562fa5-f0c9-4d56-b775-49c5c552b472" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.7:8000/healthz\": dial tcp 10.132.0.7:8000: connect: connection refused"
Apr 21 16:04:26.043727 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:26.043703 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b946f688f-rqvbb" podUID="02e5400d-5c59-42f1-b5a9-08b5a2f449fe" containerName="addon-agent" probeResult="failure" output="Get \"http://10.132.0.8:8000/healthz\": dial tcp 10.132.0.8:8000: connect: connection refused"
Apr 21 16:04:26.242185 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:26.242095 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4" podUID="17562fa5-f0c9-4d56-b775-49c5c552b472" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.7:8000/readyz\": dial tcp 10.132.0.7:8000: connect: connection refused"
Apr 21 16:04:26.539497 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:26.539413 2577 generic.go:358] "Generic (PLEG): container finished" podID="02e5400d-5c59-42f1-b5a9-08b5a2f449fe" containerID="06a471c5a8dd59a2d640f2f365b995f81be12bf4ef94b464ff7754f30b7c433e" exitCode=255
Apr 21 16:04:26.539645 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:26.539487 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b946f688f-rqvbb" event={"ID":"02e5400d-5c59-42f1-b5a9-08b5a2f449fe","Type":"ContainerDied","Data":"06a471c5a8dd59a2d640f2f365b995f81be12bf4ef94b464ff7754f30b7c433e"}
Apr 21 16:04:26.539915 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:26.539891 2577 scope.go:117] "RemoveContainer" containerID="06a471c5a8dd59a2d640f2f365b995f81be12bf4ef94b464ff7754f30b7c433e"
Apr 21 16:04:26.540720 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:26.540692 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bqkbv" event={"ID":"89cebd20-5145-4196-8844-826bc1ec6662","Type":"ContainerStarted","Data":"83e93df9ebc18e66948b4336dec6650a6c585807890a39551622c5a4dba0f3b9"}
Apr 21 16:04:26.541928 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:26.541909 2577 generic.go:358] "Generic (PLEG): container finished" podID="17562fa5-f0c9-4d56-b775-49c5c552b472" containerID="d3e9cb680c8baf20353c7a45e6773db03da88c9b3fef5c8d905f47faf5aeec74" exitCode=1
Apr 21 16:04:26.542092 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:26.541965 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4" event={"ID":"17562fa5-f0c9-4d56-b775-49c5c552b472","Type":"ContainerDied","Data":"d3e9cb680c8baf20353c7a45e6773db03da88c9b3fef5c8d905f47faf5aeec74"}
Apr 21 16:04:26.542281 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:26.542268 2577 scope.go:117] "RemoveContainer" containerID="d3e9cb680c8baf20353c7a45e6773db03da88c9b3fef5c8d905f47faf5aeec74"
Apr 21 16:04:26.543590 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:26.543574 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mp77f" event={"ID":"22292ed9-e7f3-49e8-8973-30a2e2fa17a2","Type":"ContainerStarted","Data":"db2f5127dfcbc4ab3ee8923ef906c9c62dab162b83cf89c26a783dd236779c13"}
Apr 21 16:04:26.543655 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:26.543595 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mp77f" event={"ID":"22292ed9-e7f3-49e8-8973-30a2e2fa17a2","Type":"ContainerStarted","Data":"63620764ced07c7a19821f4524c61c79ae9c884e328a401aa13e6a26d49ec70b"}
Apr 21 16:04:26.543698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:26.543691 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-mp77f"
Apr 21 16:04:26.654756 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:26.654703 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bqkbv" podStartSLOduration=129.816577096 podStartE2EDuration="2m11.654684174s" podCreationTimestamp="2026-04-21 16:02:15 +0000 UTC" firstStartedPulling="2026-04-21 16:04:23.851640449 +0000 UTC m=+160.458542539" lastFinishedPulling="2026-04-21 16:04:25.689747519 +0000 UTC m=+162.296649617" observedRunningTime="2026-04-21 16:04:26.65406319 +0000 UTC m=+163.260965298" watchObservedRunningTime="2026-04-21 16:04:26.654684174 +0000 UTC m=+163.261586280"
Apr 21 16:04:26.694351 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:26.694300 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mp77f" podStartSLOduration=130.163322867 podStartE2EDuration="2m11.694283272s" podCreationTimestamp="2026-04-21 16:02:15 +0000 UTC" firstStartedPulling="2026-04-21 16:04:24.160237033 +0000 UTC m=+160.767139119" lastFinishedPulling="2026-04-21 16:04:25.691197419 +0000 UTC m=+162.298099524" observedRunningTime="2026-04-21 16:04:26.693340816 +0000 UTC m=+163.300242923" watchObservedRunningTime="2026-04-21 16:04:26.694283272 +0000 UTC m=+163.301185381"
Apr 21 16:04:27.114382 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.114345 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-vfgtx"]
Apr 21 16:04:27.116961 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.116945 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vfgtx"
Apr 21 16:04:27.120053 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.120025 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 21 16:04:27.120178 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.120143 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rdjm8\""
Apr 21 16:04:27.121106 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.121091 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 21 16:04:27.121186 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.121167 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 21 16:04:27.121222 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.121204 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 21 16:04:27.145467 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.145438 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vfgtx"]
Apr 21 16:04:27.216516 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.216482 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6d6f47c5-fbee-4306-a493-e6ac93a55dac-crio-socket\") pod \"insights-runtime-extractor-vfgtx\" (UID: \"6d6f47c5-fbee-4306-a493-e6ac93a55dac\") " pod="openshift-insights/insights-runtime-extractor-vfgtx"
Apr 21 16:04:27.216688 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.216520 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6d6f47c5-fbee-4306-a493-e6ac93a55dac-data-volume\") pod \"insights-runtime-extractor-vfgtx\" (UID: \"6d6f47c5-fbee-4306-a493-e6ac93a55dac\") " pod="openshift-insights/insights-runtime-extractor-vfgtx"
Apr 21 16:04:27.216688 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.216565 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7r88\" (UniqueName: \"kubernetes.io/projected/6d6f47c5-fbee-4306-a493-e6ac93a55dac-kube-api-access-f7r88\") pod \"insights-runtime-extractor-vfgtx\" (UID: \"6d6f47c5-fbee-4306-a493-e6ac93a55dac\") " pod="openshift-insights/insights-runtime-extractor-vfgtx"
Apr 21 16:04:27.216688 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.216643 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6d6f47c5-fbee-4306-a493-e6ac93a55dac-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vfgtx\" (UID: \"6d6f47c5-fbee-4306-a493-e6ac93a55dac\") " pod="openshift-insights/insights-runtime-extractor-vfgtx"
Apr 21 16:04:27.216827 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.216699 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6d6f47c5-fbee-4306-a493-e6ac93a55dac-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vfgtx\" (UID: \"6d6f47c5-fbee-4306-a493-e6ac93a55dac\") " pod="openshift-insights/insights-runtime-extractor-vfgtx"
Apr 21 16:04:27.317078 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.317046 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7r88\" (UniqueName: \"kubernetes.io/projected/6d6f47c5-fbee-4306-a493-e6ac93a55dac-kube-api-access-f7r88\") pod \"insights-runtime-extractor-vfgtx\" (UID: \"6d6f47c5-fbee-4306-a493-e6ac93a55dac\") " pod="openshift-insights/insights-runtime-extractor-vfgtx"
Apr 21 16:04:27.317264 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.317098 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6d6f47c5-fbee-4306-a493-e6ac93a55dac-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vfgtx\" (UID: \"6d6f47c5-fbee-4306-a493-e6ac93a55dac\") " pod="openshift-insights/insights-runtime-extractor-vfgtx"
Apr 21 16:04:27.317264 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.317219 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6d6f47c5-fbee-4306-a493-e6ac93a55dac-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vfgtx\" (UID: \"6d6f47c5-fbee-4306-a493-e6ac93a55dac\") " pod="openshift-insights/insights-runtime-extractor-vfgtx"
Apr 21 16:04:27.317368 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.317271 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6d6f47c5-fbee-4306-a493-e6ac93a55dac-crio-socket\") pod \"insights-runtime-extractor-vfgtx\" (UID: \"6d6f47c5-fbee-4306-a493-e6ac93a55dac\") " pod="openshift-insights/insights-runtime-extractor-vfgtx"
Apr 21 16:04:27.317368 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.317301 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6d6f47c5-fbee-4306-a493-e6ac93a55dac-data-volume\") pod \"insights-runtime-extractor-vfgtx\" (UID: \"6d6f47c5-fbee-4306-a493-e6ac93a55dac\") " pod="openshift-insights/insights-runtime-extractor-vfgtx"
Apr 21 16:04:27.317368 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.317348 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6d6f47c5-fbee-4306-a493-e6ac93a55dac-crio-socket\") pod \"insights-runtime-extractor-vfgtx\" (UID: \"6d6f47c5-fbee-4306-a493-e6ac93a55dac\") " pod="openshift-insights/insights-runtime-extractor-vfgtx"
Apr 21 16:04:27.317614 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.317598 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6d6f47c5-fbee-4306-a493-e6ac93a55dac-data-volume\") pod \"insights-runtime-extractor-vfgtx\" (UID: \"6d6f47c5-fbee-4306-a493-e6ac93a55dac\") " pod="openshift-insights/insights-runtime-extractor-vfgtx"
Apr 21 16:04:27.318384 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.318365 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6d6f47c5-fbee-4306-a493-e6ac93a55dac-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vfgtx\" (UID: \"6d6f47c5-fbee-4306-a493-e6ac93a55dac\") " pod="openshift-insights/insights-runtime-extractor-vfgtx"
Apr 21 16:04:27.319266 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.319246 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6d6f47c5-fbee-4306-a493-e6ac93a55dac-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vfgtx\" (UID: \"6d6f47c5-fbee-4306-a493-e6ac93a55dac\") " pod="openshift-insights/insights-runtime-extractor-vfgtx"
Apr 21 16:04:27.336510 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.336488 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7r88\" (UniqueName: \"kubernetes.io/projected/6d6f47c5-fbee-4306-a493-e6ac93a55dac-kube-api-access-f7r88\") pod \"insights-runtime-extractor-vfgtx\" (UID: \"6d6f47c5-fbee-4306-a493-e6ac93a55dac\") " pod="openshift-insights/insights-runtime-extractor-vfgtx"
Apr 21 16:04:27.426291 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.426184 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vfgtx"
Apr 21 16:04:27.547996 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.547962 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4" event={"ID":"17562fa5-f0c9-4d56-b775-49c5c552b472","Type":"ContainerStarted","Data":"06ae170695bfc122c4301fed02d24bc7dc87b4e19767afd8c08497c81def61a1"}
Apr 21 16:04:27.548302 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.548274 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4"
Apr 21 16:04:27.549243 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.549214 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-cb4bb4dbb-tzdj4"
Apr 21 16:04:27.549679 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.549655 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6b946f688f-rqvbb" event={"ID":"02e5400d-5c59-42f1-b5a9-08b5a2f449fe","Type":"ContainerStarted","Data":"cf8548826cd69a0552994863444645cec5a3a52456d7685058baa4313be38f96"}
Apr 21 16:04:27.579039 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:27.579019 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vfgtx"]
Apr 21 16:04:27.582049 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:04:27.582029 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d6f47c5_fbee_4306_a493_e6ac93a55dac.slice/crio-e22be5fc3e57002cbca72aa0d5547109298dda7d7ccb2afb6fb762917a78b245 WatchSource:0}: Error finding container e22be5fc3e57002cbca72aa0d5547109298dda7d7ccb2afb6fb762917a78b245: Status 404 returned error can't find the container with id e22be5fc3e57002cbca72aa0d5547109298dda7d7ccb2afb6fb762917a78b245
Apr 21 16:04:28.553131 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:28.553040 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vfgtx" event={"ID":"6d6f47c5-fbee-4306-a493-e6ac93a55dac","Type":"ContainerStarted","Data":"5e81a143b6a3e0639127ab732a56e46b299f3344cdce836d0367400350d0e3a8"}
Apr 21 16:04:28.553131 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:28.553075 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vfgtx" event={"ID":"6d6f47c5-fbee-4306-a493-e6ac93a55dac","Type":"ContainerStarted","Data":"38a98adcc84ba39196c5adf83d637c13910a88c3208033dbda5eb29b1215c9a2"}
Apr 21 16:04:28.553131 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:28.553084 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vfgtx" event={"ID":"6d6f47c5-fbee-4306-a493-e6ac93a55dac","Type":"ContainerStarted","Data":"e22be5fc3e57002cbca72aa0d5547109298dda7d7ccb2afb6fb762917a78b245"}
Apr 21 16:04:30.559309 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:30.559272 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vfgtx" event={"ID":"6d6f47c5-fbee-4306-a493-e6ac93a55dac","Type":"ContainerStarted","Data":"0adcc4defad4002f3f53708532c605f229a27800067c25dd5ed2d976de2e70a5"}
Apr 21 16:04:32.049552 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:32.049516 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc" Apr 21 16:04:32.052375 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:32.052355 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-vqm4s\"" Apr 21 16:04:32.060456 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:32.060438 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc" Apr 21 16:04:32.184253 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:32.184200 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-vfgtx" podStartSLOduration=2.952651573 podStartE2EDuration="5.184185606s" podCreationTimestamp="2026-04-21 16:04:27 +0000 UTC" firstStartedPulling="2026-04-21 16:04:27.633191459 +0000 UTC m=+164.240093547" lastFinishedPulling="2026-04-21 16:04:29.864725491 +0000 UTC m=+166.471627580" observedRunningTime="2026-04-21 16:04:30.598920885 +0000 UTC m=+167.205822992" watchObservedRunningTime="2026-04-21 16:04:32.184185606 +0000 UTC m=+168.791087718" Apr 21 16:04:32.185175 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:32.185155 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-685b9dc9fd-l8klc"] Apr 21 16:04:32.188847 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:04:32.188819 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9517991_a0f5_431a_9763_7400c577640c.slice/crio-dfc07c1290fd64911797ff5b069bab0d8049c8d26d4c02130d355e7224bad15d WatchSource:0}: Error finding container dfc07c1290fd64911797ff5b069bab0d8049c8d26d4c02130d355e7224bad15d: Status 404 returned error can't find the container with id dfc07c1290fd64911797ff5b069bab0d8049c8d26d4c02130d355e7224bad15d Apr 21 16:04:32.566687 ip-10-0-129-96 kubenswrapper[2577]: 
I0421 16:04:32.566657 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc" event={"ID":"d9517991-a0f5-431a-9763-7400c577640c","Type":"ContainerStarted","Data":"33dc801dc9b6a834133d9d6a055726f9f9266e10b186f7f746b754df7ca652ab"} Apr 21 16:04:32.566687 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:32.566688 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc" event={"ID":"d9517991-a0f5-431a-9763-7400c577640c","Type":"ContainerStarted","Data":"dfc07c1290fd64911797ff5b069bab0d8049c8d26d4c02130d355e7224bad15d"} Apr 21 16:04:32.566958 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:32.566758 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc" Apr 21 16:04:32.593375 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:32.593329 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc" podStartSLOduration=168.593315109 podStartE2EDuration="2m48.593315109s" podCreationTimestamp="2026-04-21 16:01:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:04:32.592622121 +0000 UTC m=+169.199524228" watchObservedRunningTime="2026-04-21 16:04:32.593315109 +0000 UTC m=+169.200217194" Apr 21 16:04:34.051360 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:34.051323 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhl4l" Apr 21 16:04:36.014227 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.014197 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-k7rdr"] Apr 21 16:04:36.018515 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.018499 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.033993 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.033967 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 16:04:36.034970 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.034952 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 16:04:36.035056 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:04:36.035031 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:ip-10-0-129-96.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'ip-10-0-129-96.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" type="*v1.ConfigMap" Apr 21 16:04:36.084313 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.084278 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 16:04:36.084469 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.084334 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 16:04:36.100201 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.100173 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 16:04:36.100429 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.100411 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mt6fm\"" Apr 21 16:04:36.183171 ip-10-0-129-96 kubenswrapper[2577]: 
I0421 16:04:36.183136 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkgk2\" (UniqueName: \"kubernetes.io/projected/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-kube-api-access-gkgk2\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.183171 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.183170 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-node-exporter-textfile\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.183378 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.183194 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-node-exporter-tls\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.183378 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.183300 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-root\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.183378 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.183332 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-metrics-client-ca\") pod \"node-exporter-k7rdr\" (UID: 
\"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.183378 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.183356 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-sys\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.183503 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.183381 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.183503 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.183421 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-node-exporter-accelerators-collector-config\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.183503 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.183442 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-node-exporter-wtmp\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.284224 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.284126 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-root\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.284224 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.284172 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-metrics-client-ca\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.284424 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.284239 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-root\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.284424 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.284281 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-sys\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.284424 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.284307 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.284424 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.284328 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-node-exporter-accelerators-collector-config\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.284424 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.284354 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-node-exporter-wtmp\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.284424 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.284383 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-sys\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.284424 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.284395 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkgk2\" (UniqueName: \"kubernetes.io/projected/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-kube-api-access-gkgk2\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.284751 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.284445 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-node-exporter-textfile\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.284751 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.284474 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-node-exporter-tls\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.284751 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.284553 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-node-exporter-wtmp\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.284751 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:04:36.284634 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 21 16:04:36.284751 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:04:36.284713 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-node-exporter-tls podName:d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6 nodeName:}" failed. No retries permitted until 2026-04-21 16:04:36.784694416 +0000 UTC m=+173.391596525 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-node-exporter-tls") pod "node-exporter-k7rdr" (UID: "d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6") : secret "node-exporter-tls" not found Apr 21 16:04:36.284751 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.284729 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-metrics-client-ca\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.285055 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.284815 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-node-exporter-textfile\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.285055 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.284903 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-node-exporter-accelerators-collector-config\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.286629 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.286612 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.552536 ip-10-0-129-96 
kubenswrapper[2577]: I0421 16:04:36.552461 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mp77f" Apr 21 16:04:36.788497 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.788457 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-node-exporter-tls\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:36.790887 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:36.790858 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-node-exporter-tls\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:37.297494 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:04:37.297463 2577 projected.go:289] Couldn't get configMap openshift-monitoring/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Apr 21 16:04:37.297494 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:04:37.297487 2577 projected.go:194] Error preparing data for projected volume kube-api-access-gkgk2 for pod openshift-monitoring/node-exporter-k7rdr: failed to sync configmap cache: timed out waiting for the condition Apr 21 16:04:37.297896 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:04:37.297541 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-kube-api-access-gkgk2 podName:d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6 nodeName:}" failed. No retries permitted until 2026-04-21 16:04:37.797526044 +0000 UTC m=+174.404428130 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gkgk2" (UniqueName: "kubernetes.io/projected/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-kube-api-access-gkgk2") pod "node-exporter-k7rdr" (UID: "d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6") : failed to sync configmap cache: timed out waiting for the condition Apr 21 16:04:37.621100 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:37.621070 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 16:04:37.897077 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:37.896980 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkgk2\" (UniqueName: \"kubernetes.io/projected/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-kube-api-access-gkgk2\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:37.899361 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:37.899338 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkgk2\" (UniqueName: \"kubernetes.io/projected/d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6-kube-api-access-gkgk2\") pod \"node-exporter-k7rdr\" (UID: \"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6\") " pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:38.127083 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:38.127036 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-k7rdr" Apr 21 16:04:38.134990 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:04:38.134959 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd927e11d_e8c5_4b2b_bdf5_e25a34f37ac6.slice/crio-5453428c6a42dfd58dddc064be0630626e51c9e5705f896c58ec23259a68d7c2 WatchSource:0}: Error finding container 5453428c6a42dfd58dddc064be0630626e51c9e5705f896c58ec23259a68d7c2: Status 404 returned error can't find the container with id 5453428c6a42dfd58dddc064be0630626e51c9e5705f896c58ec23259a68d7c2 Apr 21 16:04:38.582715 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:38.582684 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k7rdr" event={"ID":"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6","Type":"ContainerStarted","Data":"5453428c6a42dfd58dddc064be0630626e51c9e5705f896c58ec23259a68d7c2"} Apr 21 16:04:39.590164 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:39.590128 2577 generic.go:358] "Generic (PLEG): container finished" podID="d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6" containerID="3e00d90f48d52c963e3d0052b41e2d5aa24ec781a29c6bdfb96c380517305471" exitCode=0 Apr 21 16:04:39.590540 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:39.590180 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k7rdr" event={"ID":"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6","Type":"ContainerDied","Data":"3e00d90f48d52c963e3d0052b41e2d5aa24ec781a29c6bdfb96c380517305471"} Apr 21 16:04:40.594259 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:40.594227 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k7rdr" event={"ID":"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6","Type":"ContainerStarted","Data":"f0a400ddf541930a76950ba90103e4e73337ad0e88e4830c0462a5cb4bc69d33"} Apr 21 16:04:40.594259 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:40.594260 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-k7rdr" event={"ID":"d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6","Type":"ContainerStarted","Data":"9133723245b5535762b9929f1bac69528703b9083baf96a9a37f0f8d16d239de"} Apr 21 16:04:40.632367 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:40.632317 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-k7rdr" podStartSLOduration=4.862412904 podStartE2EDuration="5.632303455s" podCreationTimestamp="2026-04-21 16:04:35 +0000 UTC" firstStartedPulling="2026-04-21 16:04:38.136836031 +0000 UTC m=+174.743738119" lastFinishedPulling="2026-04-21 16:04:38.906726585 +0000 UTC m=+175.513628670" observedRunningTime="2026-04-21 16:04:40.628717209 +0000 UTC m=+177.235619328" watchObservedRunningTime="2026-04-21 16:04:40.632303455 +0000 UTC m=+177.239205604" Apr 21 16:04:49.470393 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:49.470355 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-685b9dc9fd-l8klc"] Apr 21 16:04:49.474282 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:04:49.474259 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc" Apr 21 16:05:06.028959 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:06.028914 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs" podUID="67e3a2b6-e556-409b-98e0-59a2af74476f" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 16:05:14.489403 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.489358 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc" podUID="d9517991-a0f5-431a-9763-7400c577640c" containerName="registry" 
containerID="cri-o://33dc801dc9b6a834133d9d6a055726f9f9266e10b186f7f746b754df7ca652ab" gracePeriod=30 Apr 21 16:05:14.676953 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.676914 2577 generic.go:358] "Generic (PLEG): container finished" podID="d9517991-a0f5-431a-9763-7400c577640c" containerID="33dc801dc9b6a834133d9d6a055726f9f9266e10b186f7f746b754df7ca652ab" exitCode=0 Apr 21 16:05:14.677114 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.676987 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc" event={"ID":"d9517991-a0f5-431a-9763-7400c577640c","Type":"ContainerDied","Data":"33dc801dc9b6a834133d9d6a055726f9f9266e10b186f7f746b754df7ca652ab"} Apr 21 16:05:14.726881 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.726856 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc" Apr 21 16:05:14.765678 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.765610 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttnzs\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-kube-api-access-ttnzs\") pod \"d9517991-a0f5-431a-9763-7400c577640c\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " Apr 21 16:05:14.765678 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.765655 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d9517991-a0f5-431a-9763-7400c577640c-image-registry-private-configuration\") pod \"d9517991-a0f5-431a-9763-7400c577640c\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " Apr 21 16:05:14.765901 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.765688 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-bound-sa-token\") pod \"d9517991-a0f5-431a-9763-7400c577640c\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " Apr 21 16:05:14.765901 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.765839 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d9517991-a0f5-431a-9763-7400c577640c-ca-trust-extracted\") pod \"d9517991-a0f5-431a-9763-7400c577640c\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " Apr 21 16:05:14.765901 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.765882 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d9517991-a0f5-431a-9763-7400c577640c-registry-certificates\") pod \"d9517991-a0f5-431a-9763-7400c577640c\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " Apr 21 16:05:14.766057 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.765934 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls\") pod \"d9517991-a0f5-431a-9763-7400c577640c\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " Apr 21 16:05:14.766057 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.765977 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d9517991-a0f5-431a-9763-7400c577640c-installation-pull-secrets\") pod \"d9517991-a0f5-431a-9763-7400c577640c\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " Apr 21 16:05:14.766057 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.766004 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9517991-a0f5-431a-9763-7400c577640c-trusted-ca\") pod 
\"d9517991-a0f5-431a-9763-7400c577640c\" (UID: \"d9517991-a0f5-431a-9763-7400c577640c\") " Apr 21 16:05:14.766570 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.766508 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9517991-a0f5-431a-9763-7400c577640c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d9517991-a0f5-431a-9763-7400c577640c" (UID: "d9517991-a0f5-431a-9763-7400c577640c"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:05:14.766680 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.766594 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9517991-a0f5-431a-9763-7400c577640c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d9517991-a0f5-431a-9763-7400c577640c" (UID: "d9517991-a0f5-431a-9763-7400c577640c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 16:05:14.768677 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.768632 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9517991-a0f5-431a-9763-7400c577640c-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "d9517991-a0f5-431a-9763-7400c577640c" (UID: "d9517991-a0f5-431a-9763-7400c577640c"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:05:14.768805 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.768752 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d9517991-a0f5-431a-9763-7400c577640c" (UID: "d9517991-a0f5-431a-9763-7400c577640c"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:05:14.768959 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.768933 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-kube-api-access-ttnzs" (OuterVolumeSpecName: "kube-api-access-ttnzs") pod "d9517991-a0f5-431a-9763-7400c577640c" (UID: "d9517991-a0f5-431a-9763-7400c577640c"). InnerVolumeSpecName "kube-api-access-ttnzs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:05:14.769027 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.768997 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d9517991-a0f5-431a-9763-7400c577640c" (UID: "d9517991-a0f5-431a-9763-7400c577640c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:05:14.769083 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.769069 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9517991-a0f5-431a-9763-7400c577640c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d9517991-a0f5-431a-9763-7400c577640c" (UID: "d9517991-a0f5-431a-9763-7400c577640c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 16:05:14.775182 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.775154 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9517991-a0f5-431a-9763-7400c577640c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d9517991-a0f5-431a-9763-7400c577640c" (UID: "d9517991-a0f5-431a-9763-7400c577640c"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 16:05:14.867180 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.867136 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d9517991-a0f5-431a-9763-7400c577640c-ca-trust-extracted\") on node \"ip-10-0-129-96.ec2.internal\" DevicePath \"\"" Apr 21 16:05:14.867180 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.867176 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d9517991-a0f5-431a-9763-7400c577640c-registry-certificates\") on node \"ip-10-0-129-96.ec2.internal\" DevicePath \"\"" Apr 21 16:05:14.867180 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.867187 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-registry-tls\") on node \"ip-10-0-129-96.ec2.internal\" DevicePath \"\"" Apr 21 16:05:14.867411 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.867197 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d9517991-a0f5-431a-9763-7400c577640c-installation-pull-secrets\") on node \"ip-10-0-129-96.ec2.internal\" DevicePath \"\"" Apr 21 16:05:14.867411 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.867207 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9517991-a0f5-431a-9763-7400c577640c-trusted-ca\") on node \"ip-10-0-129-96.ec2.internal\" DevicePath \"\"" Apr 21 16:05:14.867411 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.867216 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ttnzs\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-kube-api-access-ttnzs\") on node \"ip-10-0-129-96.ec2.internal\" DevicePath \"\"" Apr 21 16:05:14.867411 ip-10-0-129-96 
kubenswrapper[2577]: I0421 16:05:14.867226 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d9517991-a0f5-431a-9763-7400c577640c-image-registry-private-configuration\") on node \"ip-10-0-129-96.ec2.internal\" DevicePath \"\"" Apr 21 16:05:14.867411 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:14.867236 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d9517991-a0f5-431a-9763-7400c577640c-bound-sa-token\") on node \"ip-10-0-129-96.ec2.internal\" DevicePath \"\"" Apr 21 16:05:15.680861 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:15.680818 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc" event={"ID":"d9517991-a0f5-431a-9763-7400c577640c","Type":"ContainerDied","Data":"dfc07c1290fd64911797ff5b069bab0d8049c8d26d4c02130d355e7224bad15d"} Apr 21 16:05:15.680861 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:15.680849 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-685b9dc9fd-l8klc" Apr 21 16:05:15.681331 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:15.680872 2577 scope.go:117] "RemoveContainer" containerID="33dc801dc9b6a834133d9d6a055726f9f9266e10b186f7f746b754df7ca652ab" Apr 21 16:05:15.703881 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:15.703854 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-685b9dc9fd-l8klc"] Apr 21 16:05:15.708229 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:15.708205 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-685b9dc9fd-l8klc"] Apr 21 16:05:16.028850 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:16.028751 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs" podUID="67e3a2b6-e556-409b-98e0-59a2af74476f" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 16:05:16.053728 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:16.053698 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9517991-a0f5-431a-9763-7400c577640c" path="/var/lib/kubelet/pods/d9517991-a0f5-431a-9763-7400c577640c/volumes" Apr 21 16:05:26.028911 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:26.028870 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs" podUID="67e3a2b6-e556-409b-98e0-59a2af74476f" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 21 16:05:26.029286 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:26.028940 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs" Apr 21 16:05:26.029404 ip-10-0-129-96 kubenswrapper[2577]: I0421 
16:05:26.029374 2577 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"08857133fd9133ba1bbdd695d4bb90ed1af0221791797a89b05731b3dc7535b7"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 21 16:05:26.029452 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:26.029421 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs" podUID="67e3a2b6-e556-409b-98e0-59a2af74476f" containerName="service-proxy" containerID="cri-o://08857133fd9133ba1bbdd695d4bb90ed1af0221791797a89b05731b3dc7535b7" gracePeriod=30 Apr 21 16:05:26.709799 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:26.709739 2577 generic.go:358] "Generic (PLEG): container finished" podID="67e3a2b6-e556-409b-98e0-59a2af74476f" containerID="08857133fd9133ba1bbdd695d4bb90ed1af0221791797a89b05731b3dc7535b7" exitCode=2 Apr 21 16:05:26.709964 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:26.709806 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs" event={"ID":"67e3a2b6-e556-409b-98e0-59a2af74476f","Type":"ContainerDied","Data":"08857133fd9133ba1bbdd695d4bb90ed1af0221791797a89b05731b3dc7535b7"} Apr 21 16:05:26.709964 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:26.709837 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-67dc99f99-9jzzs" event={"ID":"67e3a2b6-e556-409b-98e0-59a2af74476f","Type":"ContainerStarted","Data":"a551770bf71381e30d71f93e5c66f622c6d926cac85d77e620fea276777c62bc"} Apr 21 16:05:55.870577 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:55.870540 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs\") pod \"network-metrics-daemon-qhl4l\" (UID: \"4c538f88-ee19-454d-9d5b-09dd0a3ed71b\") " pod="openshift-multus/network-metrics-daemon-qhl4l" Apr 21 16:05:55.872853 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:55.872832 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c538f88-ee19-454d-9d5b-09dd0a3ed71b-metrics-certs\") pod \"network-metrics-daemon-qhl4l\" (UID: \"4c538f88-ee19-454d-9d5b-09dd0a3ed71b\") " pod="openshift-multus/network-metrics-daemon-qhl4l" Apr 21 16:05:55.954226 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:55.954195 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cz78x\"" Apr 21 16:05:55.962189 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:55.962172 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhl4l" Apr 21 16:05:56.078442 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:56.078412 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qhl4l"] Apr 21 16:05:56.081227 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:05:56.081200 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c538f88_ee19_454d_9d5b_09dd0a3ed71b.slice/crio-88683ebc5447949c3afc6319da85887aa3304a992764e58ac64c6f766b1bfe1b WatchSource:0}: Error finding container 88683ebc5447949c3afc6319da85887aa3304a992764e58ac64c6f766b1bfe1b: Status 404 returned error can't find the container with id 88683ebc5447949c3afc6319da85887aa3304a992764e58ac64c6f766b1bfe1b Apr 21 16:05:56.793034 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:56.787991 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qhl4l" 
event={"ID":"4c538f88-ee19-454d-9d5b-09dd0a3ed71b","Type":"ContainerStarted","Data":"88683ebc5447949c3afc6319da85887aa3304a992764e58ac64c6f766b1bfe1b"} Apr 21 16:05:57.792790 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:57.792728 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qhl4l" event={"ID":"4c538f88-ee19-454d-9d5b-09dd0a3ed71b","Type":"ContainerStarted","Data":"278190516e95ac1d3d33c62dbba498eb81e3b171189a5e9955737d293c0db080"} Apr 21 16:05:57.792790 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:57.792795 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qhl4l" event={"ID":"4c538f88-ee19-454d-9d5b-09dd0a3ed71b","Type":"ContainerStarted","Data":"20ccae46a5deb65acdbb260d98a7aef6104edfe2f6bef5b7da2afbb0731680d9"} Apr 21 16:05:57.814414 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:05:57.814363 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qhl4l" podStartSLOduration=252.841493347 podStartE2EDuration="4m13.814348703s" podCreationTimestamp="2026-04-21 16:01:44 +0000 UTC" firstStartedPulling="2026-04-21 16:05:56.082990432 +0000 UTC m=+252.689892517" lastFinishedPulling="2026-04-21 16:05:57.055845784 +0000 UTC m=+253.662747873" observedRunningTime="2026-04-21 16:05:57.813018662 +0000 UTC m=+254.419920769" watchObservedRunningTime="2026-04-21 16:05:57.814348703 +0000 UTC m=+254.421250811" Apr 21 16:06:43.968072 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:06:43.968043 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rkhl_06e52eb9-a2bb-4db2-8e4f-f435db21156c/ovn-acl-logging/0.log" Apr 21 16:06:43.968601 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:06:43.968043 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rkhl_06e52eb9-a2bb-4db2-8e4f-f435db21156c/ovn-acl-logging/0.log" Apr 21 
16:06:43.970829 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:06:43.970808 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 16:08:59.781307 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:08:59.781275 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-7c9c2"] Apr 21 16:08:59.781829 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:08:59.781491 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9517991-a0f5-431a-9763-7400c577640c" containerName="registry" Apr 21 16:08:59.781829 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:08:59.781501 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9517991-a0f5-431a-9763-7400c577640c" containerName="registry" Apr 21 16:08:59.781829 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:08:59.781545 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9517991-a0f5-431a-9763-7400c577640c" containerName="registry" Apr 21 16:08:59.784227 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:08:59.784211 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-7c9c2" Apr 21 16:08:59.786947 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:08:59.786922 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 21 16:08:59.787873 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:08:59.787842 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-nndjr\"" Apr 21 16:08:59.787873 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:08:59.787858 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 21 16:08:59.792227 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:08:59.792204 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-7c9c2"] Apr 21 16:08:59.805359 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:08:59.805338 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sch5h\" (UniqueName: \"kubernetes.io/projected/6ec35247-b77e-43ba-8190-726ba77f1182-kube-api-access-sch5h\") pod \"cert-manager-79c8d999ff-7c9c2\" (UID: \"6ec35247-b77e-43ba-8190-726ba77f1182\") " pod="cert-manager/cert-manager-79c8d999ff-7c9c2" Apr 21 16:08:59.805475 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:08:59.805371 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6ec35247-b77e-43ba-8190-726ba77f1182-bound-sa-token\") pod \"cert-manager-79c8d999ff-7c9c2\" (UID: \"6ec35247-b77e-43ba-8190-726ba77f1182\") " pod="cert-manager/cert-manager-79c8d999ff-7c9c2" Apr 21 16:08:59.906221 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:08:59.906180 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sch5h\" (UniqueName: 
\"kubernetes.io/projected/6ec35247-b77e-43ba-8190-726ba77f1182-kube-api-access-sch5h\") pod \"cert-manager-79c8d999ff-7c9c2\" (UID: \"6ec35247-b77e-43ba-8190-726ba77f1182\") " pod="cert-manager/cert-manager-79c8d999ff-7c9c2" Apr 21 16:08:59.906221 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:08:59.906229 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6ec35247-b77e-43ba-8190-726ba77f1182-bound-sa-token\") pod \"cert-manager-79c8d999ff-7c9c2\" (UID: \"6ec35247-b77e-43ba-8190-726ba77f1182\") " pod="cert-manager/cert-manager-79c8d999ff-7c9c2" Apr 21 16:08:59.914201 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:08:59.914169 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6ec35247-b77e-43ba-8190-726ba77f1182-bound-sa-token\") pod \"cert-manager-79c8d999ff-7c9c2\" (UID: \"6ec35247-b77e-43ba-8190-726ba77f1182\") " pod="cert-manager/cert-manager-79c8d999ff-7c9c2" Apr 21 16:08:59.914320 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:08:59.914225 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sch5h\" (UniqueName: \"kubernetes.io/projected/6ec35247-b77e-43ba-8190-726ba77f1182-kube-api-access-sch5h\") pod \"cert-manager-79c8d999ff-7c9c2\" (UID: \"6ec35247-b77e-43ba-8190-726ba77f1182\") " pod="cert-manager/cert-manager-79c8d999ff-7c9c2" Apr 21 16:09:00.094095 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:00.094057 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-7c9c2" Apr 21 16:09:00.211951 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:00.211919 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-7c9c2"] Apr 21 16:09:00.214888 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:09:00.214857 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ec35247_b77e_43ba_8190_726ba77f1182.slice/crio-37daa1db8efde0ce444ea9815256e62451b0d1de1b5f3eb635cff72374fe13b8 WatchSource:0}: Error finding container 37daa1db8efde0ce444ea9815256e62451b0d1de1b5f3eb635cff72374fe13b8: Status 404 returned error can't find the container with id 37daa1db8efde0ce444ea9815256e62451b0d1de1b5f3eb635cff72374fe13b8 Apr 21 16:09:00.216634 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:00.216619 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 16:09:00.241387 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:00.241357 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-7c9c2" event={"ID":"6ec35247-b77e-43ba-8190-726ba77f1182","Type":"ContainerStarted","Data":"37daa1db8efde0ce444ea9815256e62451b0d1de1b5f3eb635cff72374fe13b8"} Apr 21 16:09:01.779471 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:01.779398 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-8lss4"] Apr 21 16:09:01.781964 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:01.781941 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8lss4" Apr 21 16:09:01.784277 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:01.784251 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 21 16:09:01.785230 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:01.785210 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 21 16:09:01.785342 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:01.785213 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-qtsj8\"" Apr 21 16:09:01.791503 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:01.791461 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-8lss4"] Apr 21 16:09:01.821089 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:01.821060 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/af3c6353-f0c5-4a22-80bb-2e0a71253452-tmp\") pod \"openshift-lws-operator-bfc7f696d-8lss4\" (UID: \"af3c6353-f0c5-4a22-80bb-2e0a71253452\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8lss4" Apr 21 16:09:01.821269 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:01.821146 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd8nl\" (UniqueName: \"kubernetes.io/projected/af3c6353-f0c5-4a22-80bb-2e0a71253452-kube-api-access-cd8nl\") pod \"openshift-lws-operator-bfc7f696d-8lss4\" (UID: \"af3c6353-f0c5-4a22-80bb-2e0a71253452\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8lss4" Apr 21 16:09:01.922226 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:01.922194 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cd8nl\" (UniqueName: \"kubernetes.io/projected/af3c6353-f0c5-4a22-80bb-2e0a71253452-kube-api-access-cd8nl\") pod \"openshift-lws-operator-bfc7f696d-8lss4\" (UID: \"af3c6353-f0c5-4a22-80bb-2e0a71253452\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8lss4" Apr 21 16:09:01.922412 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:01.922267 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/af3c6353-f0c5-4a22-80bb-2e0a71253452-tmp\") pod \"openshift-lws-operator-bfc7f696d-8lss4\" (UID: \"af3c6353-f0c5-4a22-80bb-2e0a71253452\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8lss4" Apr 21 16:09:01.922639 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:01.922619 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/af3c6353-f0c5-4a22-80bb-2e0a71253452-tmp\") pod \"openshift-lws-operator-bfc7f696d-8lss4\" (UID: \"af3c6353-f0c5-4a22-80bb-2e0a71253452\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8lss4" Apr 21 16:09:01.930927 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:01.930900 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd8nl\" (UniqueName: \"kubernetes.io/projected/af3c6353-f0c5-4a22-80bb-2e0a71253452-kube-api-access-cd8nl\") pod \"openshift-lws-operator-bfc7f696d-8lss4\" (UID: \"af3c6353-f0c5-4a22-80bb-2e0a71253452\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8lss4" Apr 21 16:09:02.095160 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:02.095124 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8lss4" Apr 21 16:09:03.462671 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:03.462641 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-8lss4"] Apr 21 16:09:03.465797 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:09:03.465760 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf3c6353_f0c5_4a22_80bb_2e0a71253452.slice/crio-d97cf290fa4941d4f1a034da95fb645275d13a8c982db55773894917ea12ba8b WatchSource:0}: Error finding container d97cf290fa4941d4f1a034da95fb645275d13a8c982db55773894917ea12ba8b: Status 404 returned error can't find the container with id d97cf290fa4941d4f1a034da95fb645275d13a8c982db55773894917ea12ba8b Apr 21 16:09:04.258315 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:04.258105 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-7c9c2" event={"ID":"6ec35247-b77e-43ba-8190-726ba77f1182","Type":"ContainerStarted","Data":"83b18d0f50fc6e43247773b5fec45781b86e7eeef7c34ba4020d196c0a9ac37c"} Apr 21 16:09:04.259629 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:04.259594 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8lss4" event={"ID":"af3c6353-f0c5-4a22-80bb-2e0a71253452","Type":"ContainerStarted","Data":"d97cf290fa4941d4f1a034da95fb645275d13a8c982db55773894917ea12ba8b"} Apr 21 16:09:04.284883 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:04.284798 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-7c9c2" podStartSLOduration=2.110594663 podStartE2EDuration="5.284763201s" podCreationTimestamp="2026-04-21 16:08:59 +0000 UTC" firstStartedPulling="2026-04-21 16:09:00.21674415 +0000 UTC m=+436.823646234" lastFinishedPulling="2026-04-21 
16:09:03.390912672 +0000 UTC m=+439.997814772" observedRunningTime="2026-04-21 16:09:04.277245297 +0000 UTC m=+440.884147405" watchObservedRunningTime="2026-04-21 16:09:04.284763201 +0000 UTC m=+440.891665307" Apr 21 16:09:06.266481 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:06.266391 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8lss4" event={"ID":"af3c6353-f0c5-4a22-80bb-2e0a71253452","Type":"ContainerStarted","Data":"3bc0bf21073df681b924643d08ff57977e2b574422b561a2c02bab3269257b48"} Apr 21 16:09:06.292684 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:06.292634 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-8lss4" podStartSLOduration=2.778896376 podStartE2EDuration="5.292615513s" podCreationTimestamp="2026-04-21 16:09:01 +0000 UTC" firstStartedPulling="2026-04-21 16:09:03.467405176 +0000 UTC m=+440.074307261" lastFinishedPulling="2026-04-21 16:09:05.98112431 +0000 UTC m=+442.588026398" observedRunningTime="2026-04-21 16:09:06.292057351 +0000 UTC m=+442.898959458" watchObservedRunningTime="2026-04-21 16:09:06.292615513 +0000 UTC m=+442.899517620" Apr 21 16:09:25.270971 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:25.270933 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-774f54dc87-st7dm"] Apr 21 16:09:25.273276 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:25.273255 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-st7dm" Apr 21 16:09:25.275702 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:25.275679 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 21 16:09:25.276024 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:25.276004 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 21 16:09:25.276149 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:25.276094 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 21 16:09:25.276356 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:25.276337 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 21 16:09:25.276525 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:25.276508 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-5mxtt\"" Apr 21 16:09:25.289807 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:25.289757 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-774f54dc87-st7dm"] Apr 21 16:09:25.387844 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:25.387804 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/26789298-0cab-4321-846d-66101be60413-apiservice-cert\") pod \"opendatahub-operator-controller-manager-774f54dc87-st7dm\" (UID: \"26789298-0cab-4321-846d-66101be60413\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-st7dm" Apr 21 16:09:25.387844 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:25.387854 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/26789298-0cab-4321-846d-66101be60413-webhook-cert\") pod \"opendatahub-operator-controller-manager-774f54dc87-st7dm\" (UID: \"26789298-0cab-4321-846d-66101be60413\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-st7dm" Apr 21 16:09:25.388089 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:25.387951 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jksk\" (UniqueName: \"kubernetes.io/projected/26789298-0cab-4321-846d-66101be60413-kube-api-access-4jksk\") pod \"opendatahub-operator-controller-manager-774f54dc87-st7dm\" (UID: \"26789298-0cab-4321-846d-66101be60413\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-st7dm" Apr 21 16:09:25.489276 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:25.489242 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/26789298-0cab-4321-846d-66101be60413-apiservice-cert\") pod \"opendatahub-operator-controller-manager-774f54dc87-st7dm\" (UID: \"26789298-0cab-4321-846d-66101be60413\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-st7dm" Apr 21 16:09:25.489276 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:25.489284 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/26789298-0cab-4321-846d-66101be60413-webhook-cert\") pod \"opendatahub-operator-controller-manager-774f54dc87-st7dm\" (UID: \"26789298-0cab-4321-846d-66101be60413\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-st7dm" Apr 21 16:09:25.489529 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:25.489316 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jksk\" 
(UniqueName: \"kubernetes.io/projected/26789298-0cab-4321-846d-66101be60413-kube-api-access-4jksk\") pod \"opendatahub-operator-controller-manager-774f54dc87-st7dm\" (UID: \"26789298-0cab-4321-846d-66101be60413\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-st7dm" Apr 21 16:09:25.491724 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:25.491700 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/26789298-0cab-4321-846d-66101be60413-webhook-cert\") pod \"opendatahub-operator-controller-manager-774f54dc87-st7dm\" (UID: \"26789298-0cab-4321-846d-66101be60413\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-st7dm" Apr 21 16:09:25.491850 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:25.491741 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/26789298-0cab-4321-846d-66101be60413-apiservice-cert\") pod \"opendatahub-operator-controller-manager-774f54dc87-st7dm\" (UID: \"26789298-0cab-4321-846d-66101be60413\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-st7dm" Apr 21 16:09:25.518076 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:25.518049 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jksk\" (UniqueName: \"kubernetes.io/projected/26789298-0cab-4321-846d-66101be60413-kube-api-access-4jksk\") pod \"opendatahub-operator-controller-manager-774f54dc87-st7dm\" (UID: \"26789298-0cab-4321-846d-66101be60413\") " pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-st7dm" Apr 21 16:09:25.584211 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:25.584182 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-st7dm" Apr 21 16:09:25.725842 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:25.725818 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-774f54dc87-st7dm"] Apr 21 16:09:25.728953 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:09:25.728923 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26789298_0cab_4321_846d_66101be60413.slice/crio-c05c7934df5c1d66da01fe513a442c83a0a118195c3223be11055381111f6b4c WatchSource:0}: Error finding container c05c7934df5c1d66da01fe513a442c83a0a118195c3223be11055381111f6b4c: Status 404 returned error can't find the container with id c05c7934df5c1d66da01fe513a442c83a0a118195c3223be11055381111f6b4c Apr 21 16:09:26.323625 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:26.323586 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-st7dm" event={"ID":"26789298-0cab-4321-846d-66101be60413","Type":"ContainerStarted","Data":"c05c7934df5c1d66da01fe513a442c83a0a118195c3223be11055381111f6b4c"} Apr 21 16:09:28.331193 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:28.331160 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-st7dm" event={"ID":"26789298-0cab-4321-846d-66101be60413","Type":"ContainerStarted","Data":"f5e6e94b4e7f1efa0ca32577e03416f979914ccb1f2d08763de335f1f08ecaa4"} Apr 21 16:09:28.331609 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:28.331287 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-st7dm" Apr 21 16:09:28.374803 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:28.374727 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-st7dm" podStartSLOduration=0.86339168 podStartE2EDuration="3.37471296s" podCreationTimestamp="2026-04-21 16:09:25 +0000 UTC" firstStartedPulling="2026-04-21 16:09:25.730673678 +0000 UTC m=+462.337575763" lastFinishedPulling="2026-04-21 16:09:28.241994955 +0000 UTC m=+464.848897043" observedRunningTime="2026-04-21 16:09:28.373735419 +0000 UTC m=+464.980637528" watchObservedRunningTime="2026-04-21 16:09:28.37471296 +0000 UTC m=+464.981615058" Apr 21 16:09:39.336473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:39.336436 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-774f54dc87-st7dm" Apr 21 16:09:44.866011 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:44.865973 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-5598cc66fd-rczdm"] Apr 21 16:09:44.876885 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:44.876862 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5598cc66fd-rczdm" Apr 21 16:09:44.879953 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:44.879864 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 21 16:09:44.880461 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:44.880428 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 16:09:44.880576 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:44.880519 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-97p9h\"" Apr 21 16:09:44.880640 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:44.880593 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 21 16:09:44.880692 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:44.880637 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 16:09:44.880754 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:44.880732 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5598cc66fd-rczdm"] Apr 21 16:09:45.034369 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:45.034328 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45c5d172-f16d-4ebd-9d96-3a8ea68e126c-tls-certs\") pod \"kube-auth-proxy-5598cc66fd-rczdm\" (UID: \"45c5d172-f16d-4ebd-9d96-3a8ea68e126c\") " pod="openshift-ingress/kube-auth-proxy-5598cc66fd-rczdm" Apr 21 16:09:45.034567 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:45.034385 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/45c5d172-f16d-4ebd-9d96-3a8ea68e126c-tmp\") pod \"kube-auth-proxy-5598cc66fd-rczdm\" (UID: \"45c5d172-f16d-4ebd-9d96-3a8ea68e126c\") " pod="openshift-ingress/kube-auth-proxy-5598cc66fd-rczdm" Apr 21 16:09:45.034567 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:45.034413 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shwtp\" (UniqueName: \"kubernetes.io/projected/45c5d172-f16d-4ebd-9d96-3a8ea68e126c-kube-api-access-shwtp\") pod \"kube-auth-proxy-5598cc66fd-rczdm\" (UID: \"45c5d172-f16d-4ebd-9d96-3a8ea68e126c\") " pod="openshift-ingress/kube-auth-proxy-5598cc66fd-rczdm" Apr 21 16:09:45.135436 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:45.135344 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45c5d172-f16d-4ebd-9d96-3a8ea68e126c-tls-certs\") pod \"kube-auth-proxy-5598cc66fd-rczdm\" (UID: \"45c5d172-f16d-4ebd-9d96-3a8ea68e126c\") " pod="openshift-ingress/kube-auth-proxy-5598cc66fd-rczdm" Apr 21 16:09:45.135436 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:45.135394 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/45c5d172-f16d-4ebd-9d96-3a8ea68e126c-tmp\") pod \"kube-auth-proxy-5598cc66fd-rczdm\" (UID: \"45c5d172-f16d-4ebd-9d96-3a8ea68e126c\") " pod="openshift-ingress/kube-auth-proxy-5598cc66fd-rczdm" Apr 21 16:09:45.135436 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:45.135416 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shwtp\" (UniqueName: \"kubernetes.io/projected/45c5d172-f16d-4ebd-9d96-3a8ea68e126c-kube-api-access-shwtp\") pod \"kube-auth-proxy-5598cc66fd-rczdm\" (UID: \"45c5d172-f16d-4ebd-9d96-3a8ea68e126c\") " pod="openshift-ingress/kube-auth-proxy-5598cc66fd-rczdm" Apr 21 16:09:45.137633 ip-10-0-129-96 kubenswrapper[2577]: I0421 
16:09:45.137607 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/45c5d172-f16d-4ebd-9d96-3a8ea68e126c-tmp\") pod \"kube-auth-proxy-5598cc66fd-rczdm\" (UID: \"45c5d172-f16d-4ebd-9d96-3a8ea68e126c\") " pod="openshift-ingress/kube-auth-proxy-5598cc66fd-rczdm" Apr 21 16:09:45.137801 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:45.137765 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/45c5d172-f16d-4ebd-9d96-3a8ea68e126c-tls-certs\") pod \"kube-auth-proxy-5598cc66fd-rczdm\" (UID: \"45c5d172-f16d-4ebd-9d96-3a8ea68e126c\") " pod="openshift-ingress/kube-auth-proxy-5598cc66fd-rczdm" Apr 21 16:09:45.145154 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:45.145127 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shwtp\" (UniqueName: \"kubernetes.io/projected/45c5d172-f16d-4ebd-9d96-3a8ea68e126c-kube-api-access-shwtp\") pod \"kube-auth-proxy-5598cc66fd-rczdm\" (UID: \"45c5d172-f16d-4ebd-9d96-3a8ea68e126c\") " pod="openshift-ingress/kube-auth-proxy-5598cc66fd-rczdm" Apr 21 16:09:45.187581 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:45.187556 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5598cc66fd-rczdm" Apr 21 16:09:45.306048 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:45.306018 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5598cc66fd-rczdm"] Apr 21 16:09:45.309250 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:09:45.309218 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45c5d172_f16d_4ebd_9d96_3a8ea68e126c.slice/crio-43930421d175cd8b3ffcb978e2b8fea5adf26ae0640c78d9214484f501586db7 WatchSource:0}: Error finding container 43930421d175cd8b3ffcb978e2b8fea5adf26ae0640c78d9214484f501586db7: Status 404 returned error can't find the container with id 43930421d175cd8b3ffcb978e2b8fea5adf26ae0640c78d9214484f501586db7 Apr 21 16:09:45.377416 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:45.377383 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5598cc66fd-rczdm" event={"ID":"45c5d172-f16d-4ebd-9d96-3a8ea68e126c","Type":"ContainerStarted","Data":"43930421d175cd8b3ffcb978e2b8fea5adf26ae0640c78d9214484f501586db7"} Apr 21 16:09:46.857387 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:46.857356 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-bw97v"] Apr 21 16:09:46.865270 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:46.865245 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-bw97v" Apr 21 16:09:46.868214 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:46.868185 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-cbmht\"" Apr 21 16:09:46.869348 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:46.869155 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 21 16:09:46.871063 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:46.871040 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-bw97v"] Apr 21 16:09:46.952105 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:46.952071 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2423b3fc-41cc-40e4-956e-9f52f63198d3-cert\") pod \"odh-model-controller-858dbf95b8-bw97v\" (UID: \"2423b3fc-41cc-40e4-956e-9f52f63198d3\") " pod="opendatahub/odh-model-controller-858dbf95b8-bw97v" Apr 21 16:09:46.952328 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:46.952125 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmm49\" (UniqueName: \"kubernetes.io/projected/2423b3fc-41cc-40e4-956e-9f52f63198d3-kube-api-access-fmm49\") pod \"odh-model-controller-858dbf95b8-bw97v\" (UID: \"2423b3fc-41cc-40e4-956e-9f52f63198d3\") " pod="opendatahub/odh-model-controller-858dbf95b8-bw97v" Apr 21 16:09:47.053230 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:47.053194 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2423b3fc-41cc-40e4-956e-9f52f63198d3-cert\") pod \"odh-model-controller-858dbf95b8-bw97v\" (UID: \"2423b3fc-41cc-40e4-956e-9f52f63198d3\") " pod="opendatahub/odh-model-controller-858dbf95b8-bw97v" Apr 21 
16:09:47.053401 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:47.053264 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmm49\" (UniqueName: \"kubernetes.io/projected/2423b3fc-41cc-40e4-956e-9f52f63198d3-kube-api-access-fmm49\") pod \"odh-model-controller-858dbf95b8-bw97v\" (UID: \"2423b3fc-41cc-40e4-956e-9f52f63198d3\") " pod="opendatahub/odh-model-controller-858dbf95b8-bw97v" Apr 21 16:09:47.053401 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:09:47.053356 2577 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 21 16:09:47.053506 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:09:47.053432 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2423b3fc-41cc-40e4-956e-9f52f63198d3-cert podName:2423b3fc-41cc-40e4-956e-9f52f63198d3 nodeName:}" failed. No retries permitted until 2026-04-21 16:09:47.553409967 +0000 UTC m=+484.160312060 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2423b3fc-41cc-40e4-956e-9f52f63198d3-cert") pod "odh-model-controller-858dbf95b8-bw97v" (UID: "2423b3fc-41cc-40e4-956e-9f52f63198d3") : secret "odh-model-controller-webhook-cert" not found Apr 21 16:09:47.072298 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:47.072260 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmm49\" (UniqueName: \"kubernetes.io/projected/2423b3fc-41cc-40e4-956e-9f52f63198d3-kube-api-access-fmm49\") pod \"odh-model-controller-858dbf95b8-bw97v\" (UID: \"2423b3fc-41cc-40e4-956e-9f52f63198d3\") " pod="opendatahub/odh-model-controller-858dbf95b8-bw97v" Apr 21 16:09:47.559211 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:47.559164 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2423b3fc-41cc-40e4-956e-9f52f63198d3-cert\") pod \"odh-model-controller-858dbf95b8-bw97v\" (UID: \"2423b3fc-41cc-40e4-956e-9f52f63198d3\") " pod="opendatahub/odh-model-controller-858dbf95b8-bw97v" Apr 21 16:09:47.562102 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:47.562077 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2423b3fc-41cc-40e4-956e-9f52f63198d3-cert\") pod \"odh-model-controller-858dbf95b8-bw97v\" (UID: \"2423b3fc-41cc-40e4-956e-9f52f63198d3\") " pod="opendatahub/odh-model-controller-858dbf95b8-bw97v" Apr 21 16:09:47.777310 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:47.777272 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-bw97v" Apr 21 16:09:47.897563 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:47.897533 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-bw97v"] Apr 21 16:09:47.900907 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:09:47.900882 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2423b3fc_41cc_40e4_956e_9f52f63198d3.slice/crio-7d3c5f79fc5ed28829325103749f920f1fb654b216614b877f52f1b4a6ee1664 WatchSource:0}: Error finding container 7d3c5f79fc5ed28829325103749f920f1fb654b216614b877f52f1b4a6ee1664: Status 404 returned error can't find the container with id 7d3c5f79fc5ed28829325103749f920f1fb654b216614b877f52f1b4a6ee1664 Apr 21 16:09:48.386312 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:48.386276 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-bw97v" event={"ID":"2423b3fc-41cc-40e4-956e-9f52f63198d3","Type":"ContainerStarted","Data":"7d3c5f79fc5ed28829325103749f920f1fb654b216614b877f52f1b4a6ee1664"} Apr 21 16:09:50.392621 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:50.392586 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5598cc66fd-rczdm" event={"ID":"45c5d172-f16d-4ebd-9d96-3a8ea68e126c","Type":"ContainerStarted","Data":"8c491fc88ab5320656dd7b67dfffd018ae6ea328cde229ab3442a92f8694687b"} Apr 21 16:09:50.410813 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:50.410734 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-5598cc66fd-rczdm" podStartSLOduration=1.710867596 podStartE2EDuration="6.410700019s" podCreationTimestamp="2026-04-21 16:09:44 +0000 UTC" firstStartedPulling="2026-04-21 16:09:45.311132618 +0000 UTC m=+481.918034721" lastFinishedPulling="2026-04-21 16:09:50.010965057 +0000 UTC 
m=+486.617867144" observedRunningTime="2026-04-21 16:09:50.40990945 +0000 UTC m=+487.016811559" watchObservedRunningTime="2026-04-21 16:09:50.410700019 +0000 UTC m=+487.017602127" Apr 21 16:09:51.396619 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:51.396585 2577 generic.go:358] "Generic (PLEG): container finished" podID="2423b3fc-41cc-40e4-956e-9f52f63198d3" containerID="14f932692d5877bf84a644112a60e8347f6fdb644bfc070f638603cac561114f" exitCode=1 Apr 21 16:09:51.397118 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:51.396677 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-bw97v" event={"ID":"2423b3fc-41cc-40e4-956e-9f52f63198d3","Type":"ContainerDied","Data":"14f932692d5877bf84a644112a60e8347f6fdb644bfc070f638603cac561114f"} Apr 21 16:09:51.397118 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:51.396861 2577 scope.go:117] "RemoveContainer" containerID="14f932692d5877bf84a644112a60e8347f6fdb644bfc070f638603cac561114f" Apr 21 16:09:52.401374 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:52.401340 2577 generic.go:358] "Generic (PLEG): container finished" podID="2423b3fc-41cc-40e4-956e-9f52f63198d3" containerID="4c1c9b51e27e83aeb4fdc4d21d519df973628fe3f83b96146741c8d90dad75f4" exitCode=1 Apr 21 16:09:52.401809 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:52.401388 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-bw97v" event={"ID":"2423b3fc-41cc-40e4-956e-9f52f63198d3","Type":"ContainerDied","Data":"4c1c9b51e27e83aeb4fdc4d21d519df973628fe3f83b96146741c8d90dad75f4"} Apr 21 16:09:52.401809 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:52.401414 2577 scope.go:117] "RemoveContainer" containerID="14f932692d5877bf84a644112a60e8347f6fdb644bfc070f638603cac561114f" Apr 21 16:09:52.401809 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:52.401602 2577 scope.go:117] "RemoveContainer" 
containerID="4c1c9b51e27e83aeb4fdc4d21d519df973628fe3f83b96146741c8d90dad75f4" Apr 21 16:09:52.401809 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:09:52.401805 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-bw97v_opendatahub(2423b3fc-41cc-40e4-956e-9f52f63198d3)\"" pod="opendatahub/odh-model-controller-858dbf95b8-bw97v" podUID="2423b3fc-41cc-40e4-956e-9f52f63198d3" Apr 21 16:09:53.405596 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:53.405564 2577 scope.go:117] "RemoveContainer" containerID="4c1c9b51e27e83aeb4fdc4d21d519df973628fe3f83b96146741c8d90dad75f4" Apr 21 16:09:53.405997 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:09:53.405730 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-bw97v_opendatahub(2423b3fc-41cc-40e4-956e-9f52f63198d3)\"" pod="opendatahub/odh-model-controller-858dbf95b8-bw97v" podUID="2423b3fc-41cc-40e4-956e-9f52f63198d3" Apr 21 16:09:55.253577 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:55.253544 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-k56mb"] Apr 21 16:09:55.256681 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:55.256662 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-k56mb" Apr 21 16:09:55.262390 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:55.262365 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-l895s\"" Apr 21 16:09:55.263199 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:55.263182 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 21 16:09:55.291548 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:55.291524 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-k56mb"] Apr 21 16:09:55.422566 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:55.422534 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/af849393-15af-4721-953e-a91eb386709d-cert\") pod \"kserve-controller-manager-856948b99f-k56mb\" (UID: \"af849393-15af-4721-953e-a91eb386709d\") " pod="opendatahub/kserve-controller-manager-856948b99f-k56mb" Apr 21 16:09:55.422753 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:55.422590 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svzb5\" (UniqueName: \"kubernetes.io/projected/af849393-15af-4721-953e-a91eb386709d-kube-api-access-svzb5\") pod \"kserve-controller-manager-856948b99f-k56mb\" (UID: \"af849393-15af-4721-953e-a91eb386709d\") " pod="opendatahub/kserve-controller-manager-856948b99f-k56mb" Apr 21 16:09:55.523567 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:55.523481 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svzb5\" (UniqueName: \"kubernetes.io/projected/af849393-15af-4721-953e-a91eb386709d-kube-api-access-svzb5\") pod \"kserve-controller-manager-856948b99f-k56mb\" (UID: 
\"af849393-15af-4721-953e-a91eb386709d\") " pod="opendatahub/kserve-controller-manager-856948b99f-k56mb" Apr 21 16:09:55.523567 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:55.523529 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/af849393-15af-4721-953e-a91eb386709d-cert\") pod \"kserve-controller-manager-856948b99f-k56mb\" (UID: \"af849393-15af-4721-953e-a91eb386709d\") " pod="opendatahub/kserve-controller-manager-856948b99f-k56mb" Apr 21 16:09:55.523797 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:09:55.523658 2577 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 21 16:09:55.523797 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:09:55.523722 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/af849393-15af-4721-953e-a91eb386709d-cert podName:af849393-15af-4721-953e-a91eb386709d nodeName:}" failed. No retries permitted until 2026-04-21 16:09:56.023700698 +0000 UTC m=+492.630602785 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/af849393-15af-4721-953e-a91eb386709d-cert") pod "kserve-controller-manager-856948b99f-k56mb" (UID: "af849393-15af-4721-953e-a91eb386709d") : secret "kserve-webhook-server-cert" not found Apr 21 16:09:55.538623 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:55.538595 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svzb5\" (UniqueName: \"kubernetes.io/projected/af849393-15af-4721-953e-a91eb386709d-kube-api-access-svzb5\") pod \"kserve-controller-manager-856948b99f-k56mb\" (UID: \"af849393-15af-4721-953e-a91eb386709d\") " pod="opendatahub/kserve-controller-manager-856948b99f-k56mb" Apr 21 16:09:56.027856 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:56.027816 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/af849393-15af-4721-953e-a91eb386709d-cert\") pod \"kserve-controller-manager-856948b99f-k56mb\" (UID: \"af849393-15af-4721-953e-a91eb386709d\") " pod="opendatahub/kserve-controller-manager-856948b99f-k56mb" Apr 21 16:09:56.030211 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:56.030187 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/af849393-15af-4721-953e-a91eb386709d-cert\") pod \"kserve-controller-manager-856948b99f-k56mb\" (UID: \"af849393-15af-4721-953e-a91eb386709d\") " pod="opendatahub/kserve-controller-manager-856948b99f-k56mb" Apr 21 16:09:56.167761 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:56.167727 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-k56mb" Apr 21 16:09:56.303050 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:56.302955 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-k56mb"] Apr 21 16:09:56.413825 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:56.413785 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-k56mb" event={"ID":"af849393-15af-4721-953e-a91eb386709d","Type":"ContainerStarted","Data":"3ff67448c758870cd22a3da8efc626eae267f9de2614fbe629119daec864830c"} Apr 21 16:09:57.714127 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:57.714076 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-5z5ll"] Apr 21 16:09:57.716912 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:57.716895 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-5z5ll" Apr 21 16:09:57.719310 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:57.719287 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 21 16:09:57.719732 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:57.719713 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 21 16:09:57.719839 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:57.719755 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-vhs7d\"" Apr 21 16:09:57.732145 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:57.732116 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-5z5ll"] Apr 21 16:09:57.778410 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:57.778378 2577 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-bw97v"
Apr 21 16:09:57.778764 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:57.778749 2577 scope.go:117] "RemoveContainer" containerID="4c1c9b51e27e83aeb4fdc4d21d519df973628fe3f83b96146741c8d90dad75f4"
Apr 21 16:09:57.778980 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:09:57.778964 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-bw97v_opendatahub(2423b3fc-41cc-40e4-956e-9f52f63198d3)\"" pod="opendatahub/odh-model-controller-858dbf95b8-bw97v" podUID="2423b3fc-41cc-40e4-956e-9f52f63198d3"
Apr 21 16:09:57.839473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:57.839440 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6e85f9f6-a62a-4daa-8fa1-f88fd18a1043-operator-config\") pod \"servicemesh-operator3-55f49c5f94-5z5ll\" (UID: \"6e85f9f6-a62a-4daa-8fa1-f88fd18a1043\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-5z5ll"
Apr 21 16:09:57.839644 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:57.839498 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frtc4\" (UniqueName: \"kubernetes.io/projected/6e85f9f6-a62a-4daa-8fa1-f88fd18a1043-kube-api-access-frtc4\") pod \"servicemesh-operator3-55f49c5f94-5z5ll\" (UID: \"6e85f9f6-a62a-4daa-8fa1-f88fd18a1043\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-5z5ll"
Apr 21 16:09:57.940681 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:57.940649 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frtc4\" (UniqueName: \"kubernetes.io/projected/6e85f9f6-a62a-4daa-8fa1-f88fd18a1043-kube-api-access-frtc4\") pod \"servicemesh-operator3-55f49c5f94-5z5ll\" (UID: \"6e85f9f6-a62a-4daa-8fa1-f88fd18a1043\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-5z5ll"
Apr 21 16:09:57.940876 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:57.940729 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6e85f9f6-a62a-4daa-8fa1-f88fd18a1043-operator-config\") pod \"servicemesh-operator3-55f49c5f94-5z5ll\" (UID: \"6e85f9f6-a62a-4daa-8fa1-f88fd18a1043\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-5z5ll"
Apr 21 16:09:57.943882 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:57.943855 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/6e85f9f6-a62a-4daa-8fa1-f88fd18a1043-operator-config\") pod \"servicemesh-operator3-55f49c5f94-5z5ll\" (UID: \"6e85f9f6-a62a-4daa-8fa1-f88fd18a1043\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-5z5ll"
Apr 21 16:09:57.953159 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:57.953127 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frtc4\" (UniqueName: \"kubernetes.io/projected/6e85f9f6-a62a-4daa-8fa1-f88fd18a1043-kube-api-access-frtc4\") pod \"servicemesh-operator3-55f49c5f94-5z5ll\" (UID: \"6e85f9f6-a62a-4daa-8fa1-f88fd18a1043\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-5z5ll"
Apr 21 16:09:58.026201 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:58.026093 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-5z5ll"
Apr 21 16:09:58.156420 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:58.156394 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-5z5ll"]
Apr 21 16:09:58.158926 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:09:58.158900 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e85f9f6_a62a_4daa_8fa1_f88fd18a1043.slice/crio-eebb541f812b21f71decb69eb2dd650114c96b5a34467d626d37b1dec4d04cb4 WatchSource:0}: Error finding container eebb541f812b21f71decb69eb2dd650114c96b5a34467d626d37b1dec4d04cb4: Status 404 returned error can't find the container with id eebb541f812b21f71decb69eb2dd650114c96b5a34467d626d37b1dec4d04cb4
Apr 21 16:09:58.421012 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:58.420979 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-5z5ll" event={"ID":"6e85f9f6-a62a-4daa-8fa1-f88fd18a1043","Type":"ContainerStarted","Data":"eebb541f812b21f71decb69eb2dd650114c96b5a34467d626d37b1dec4d04cb4"}
Apr 21 16:09:59.426566 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:59.426527 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-k56mb" event={"ID":"af849393-15af-4721-953e-a91eb386709d","Type":"ContainerStarted","Data":"87c44f99ed3341f5128283a47418bca805cdfc643104b3d77c91ef5bfc6089d0"}
Apr 21 16:09:59.427034 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:59.426666 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-k56mb"
Apr 21 16:09:59.460706 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:09:59.460640 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-k56mb" podStartSLOduration=1.779253194 podStartE2EDuration="4.460617349s" podCreationTimestamp="2026-04-21 16:09:55 +0000 UTC" firstStartedPulling="2026-04-21 16:09:56.313712542 +0000 UTC m=+492.920614642" lastFinishedPulling="2026-04-21 16:09:58.995076695 +0000 UTC m=+495.601978797" observedRunningTime="2026-04-21 16:09:59.457390215 +0000 UTC m=+496.064292323" watchObservedRunningTime="2026-04-21 16:09:59.460617349 +0000 UTC m=+496.067519458"
Apr 21 16:10:01.434301 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:01.434263 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-5z5ll" event={"ID":"6e85f9f6-a62a-4daa-8fa1-f88fd18a1043","Type":"ContainerStarted","Data":"56ead406a639dd3f43eea8eab0433d616730ce2a9e521040659e3114a8819a31"}
Apr 21 16:10:01.434704 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:01.434377 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-5z5ll"
Apr 21 16:10:01.458720 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:01.458669 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-5z5ll" podStartSLOduration=1.8886177659999999 podStartE2EDuration="4.458654203s" podCreationTimestamp="2026-04-21 16:09:57 +0000 UTC" firstStartedPulling="2026-04-21 16:09:58.161536266 +0000 UTC m=+494.768438350" lastFinishedPulling="2026-04-21 16:10:00.731572698 +0000 UTC m=+497.338474787" observedRunningTime="2026-04-21 16:10:01.457332162 +0000 UTC m=+498.064234270" watchObservedRunningTime="2026-04-21 16:10:01.458654203 +0000 UTC m=+498.065556309"
Apr 21 16:10:07.778163 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:07.778123 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-bw97v"
Apr 21 16:10:07.778654 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:07.778616 2577 scope.go:117] "RemoveContainer" containerID="4c1c9b51e27e83aeb4fdc4d21d519df973628fe3f83b96146741c8d90dad75f4"
Apr 21 16:10:08.455740 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:08.455654 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-bw97v" event={"ID":"2423b3fc-41cc-40e4-956e-9f52f63198d3","Type":"ContainerStarted","Data":"e9d1a756128aaed0ba7349681420c00b34a90ac0eb7f97529373981dd6b690d4"}
Apr 21 16:10:08.455914 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:08.455759 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-bw97v"
Apr 21 16:10:08.505368 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:08.505319 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-bw97v" podStartSLOduration=2.342324407 podStartE2EDuration="22.505303684s" podCreationTimestamp="2026-04-21 16:09:46 +0000 UTC" firstStartedPulling="2026-04-21 16:09:47.90214359 +0000 UTC m=+484.509045675" lastFinishedPulling="2026-04-21 16:10:08.065122867 +0000 UTC m=+504.672024952" observedRunningTime="2026-04-21 16:10:08.503735117 +0000 UTC m=+505.110637224" watchObservedRunningTime="2026-04-21 16:10:08.505303684 +0000 UTC m=+505.112205791"
Apr 21 16:10:12.439125 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:12.439092 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-5z5ll"
Apr 21 16:10:19.461707 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:19.461676 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-bw97v"
Apr 21 16:10:22.422056 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.422024 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"]
Apr 21 16:10:22.446654 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.446417 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"]
Apr 21 16:10:22.446654 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.446592 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.449181 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.449158 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 21 16:10:22.449304 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.449243 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 21 16:10:22.449304 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.449264 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 21 16:10:22.449413 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.449347 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 21 16:10:22.449631 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.449610 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-56jkg\""
Apr 21 16:10:22.519444 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.519413 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/4eda884f-b350-476a-b497-386c049422d2-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-mkphv\" (UID: \"4eda884f-b350-476a-b497-386c049422d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.519622 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.519459 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqpl9\" (UniqueName: \"kubernetes.io/projected/4eda884f-b350-476a-b497-386c049422d2-kube-api-access-fqpl9\") pod \"istiod-openshift-gateway-55ff986f96-mkphv\" (UID: \"4eda884f-b350-476a-b497-386c049422d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.519622 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.519499 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/4eda884f-b350-476a-b497-386c049422d2-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-mkphv\" (UID: \"4eda884f-b350-476a-b497-386c049422d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.519622 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.519525 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4eda884f-b350-476a-b497-386c049422d2-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-mkphv\" (UID: \"4eda884f-b350-476a-b497-386c049422d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.519622 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.519586 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4eda884f-b350-476a-b497-386c049422d2-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-mkphv\" (UID: \"4eda884f-b350-476a-b497-386c049422d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.519862 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.519638 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/4eda884f-b350-476a-b497-386c049422d2-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-mkphv\" (UID: \"4eda884f-b350-476a-b497-386c049422d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.519862 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.519667 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/4eda884f-b350-476a-b497-386c049422d2-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-mkphv\" (UID: \"4eda884f-b350-476a-b497-386c049422d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.620062 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.620023 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/4eda884f-b350-476a-b497-386c049422d2-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-mkphv\" (UID: \"4eda884f-b350-476a-b497-386c049422d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.620062 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.620065 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/4eda884f-b350-476a-b497-386c049422d2-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-mkphv\" (UID: \"4eda884f-b350-476a-b497-386c049422d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.620310 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.620247 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/4eda884f-b350-476a-b497-386c049422d2-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-mkphv\" (UID: \"4eda884f-b350-476a-b497-386c049422d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.620371 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.620306 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqpl9\" (UniqueName: \"kubernetes.io/projected/4eda884f-b350-476a-b497-386c049422d2-kube-api-access-fqpl9\") pod \"istiod-openshift-gateway-55ff986f96-mkphv\" (UID: \"4eda884f-b350-476a-b497-386c049422d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.620371 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.620338 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/4eda884f-b350-476a-b497-386c049422d2-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-mkphv\" (UID: \"4eda884f-b350-476a-b497-386c049422d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.620371 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.620367 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4eda884f-b350-476a-b497-386c049422d2-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-mkphv\" (UID: \"4eda884f-b350-476a-b497-386c049422d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.620532 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.620427 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4eda884f-b350-476a-b497-386c049422d2-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-mkphv\" (UID: \"4eda884f-b350-476a-b497-386c049422d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.620862 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.620841 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/4eda884f-b350-476a-b497-386c049422d2-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-mkphv\" (UID: \"4eda884f-b350-476a-b497-386c049422d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.622382 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.622361 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/4eda884f-b350-476a-b497-386c049422d2-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-mkphv\" (UID: \"4eda884f-b350-476a-b497-386c049422d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.622572 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.622551 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/4eda884f-b350-476a-b497-386c049422d2-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-mkphv\" (UID: \"4eda884f-b350-476a-b497-386c049422d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.622645 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.622607 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/4eda884f-b350-476a-b497-386c049422d2-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-mkphv\" (UID: \"4eda884f-b350-476a-b497-386c049422d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.622691 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.622670 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4eda884f-b350-476a-b497-386c049422d2-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-mkphv\" (UID: \"4eda884f-b350-476a-b497-386c049422d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.631653 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.631630 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqpl9\" (UniqueName: \"kubernetes.io/projected/4eda884f-b350-476a-b497-386c049422d2-kube-api-access-fqpl9\") pod \"istiod-openshift-gateway-55ff986f96-mkphv\" (UID: \"4eda884f-b350-476a-b497-386c049422d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.637887 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.637864 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4eda884f-b350-476a-b497-386c049422d2-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-mkphv\" (UID: \"4eda884f-b350-476a-b497-386c049422d2\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.757148 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.757062 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:22.887527 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:22.887491 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"]
Apr 21 16:10:22.891807 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:10:22.891765 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4eda884f_b350_476a_b497_386c049422d2.slice/crio-781c82162e25e348588b528a589a84d4885aea0bde80f112ab732fd32d3a507e WatchSource:0}: Error finding container 781c82162e25e348588b528a589a84d4885aea0bde80f112ab732fd32d3a507e: Status 404 returned error can't find the container with id 781c82162e25e348588b528a589a84d4885aea0bde80f112ab732fd32d3a507e
Apr 21 16:10:23.500278 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:23.500233 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv" event={"ID":"4eda884f-b350-476a-b497-386c049422d2","Type":"ContainerStarted","Data":"781c82162e25e348588b528a589a84d4885aea0bde80f112ab732fd32d3a507e"}
Apr 21 16:10:26.064121 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:26.064081 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 21 16:10:26.064495 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:26.064167 2577 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"}
Apr 21 16:10:26.511544 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:26.511443 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv" event={"ID":"4eda884f-b350-476a-b497-386c049422d2","Type":"ContainerStarted","Data":"d690a9ac7bf32eefafbb20c6a71b08e769681a0797583724f34183c7327ad192"}
Apr 21 16:10:26.511712 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:26.511604 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:10:26.533512 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:26.533465 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv" podStartSLOduration=1.3631994760000001 podStartE2EDuration="4.533434731s" podCreationTimestamp="2026-04-21 16:10:22 +0000 UTC" firstStartedPulling="2026-04-21 16:10:22.893643668 +0000 UTC m=+519.500545753" lastFinishedPulling="2026-04-21 16:10:26.063878923 +0000 UTC m=+522.670781008" observedRunningTime="2026-04-21 16:10:26.532626013 +0000 UTC m=+523.139528121" watchObservedRunningTime="2026-04-21 16:10:26.533434731 +0000 UTC m=+523.140336838"
Apr 21 16:10:27.521345 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:27.521303 2577 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-mkphv container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 21 16:10:27.521811 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:27.521368 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv" podUID="4eda884f-b350-476a-b497-386c049422d2" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 16:10:30.434533 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:30.434504 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-k56mb"
Apr 21 16:10:30.516686 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:10:30.516653 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-mkphv"
Apr 21 16:11:22.682735 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:22.682700 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-x956l"]
Apr 21 16:11:22.685545 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:22.685529 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-x956l"
Apr 21 16:11:22.691240 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:22.691219 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 21 16:11:22.692639 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:22.692618 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 21 16:11:22.692942 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:22.692923 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-z8ps8\""
Apr 21 16:11:22.727717 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:22.727673 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-x956l"]
Apr 21 16:11:22.799260 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:22.799225 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95zx6\" (UniqueName: \"kubernetes.io/projected/de92ee0c-6604-4ac8-82dd-fcc0ece17287-kube-api-access-95zx6\") pod \"authorino-operator-657f44b778-x956l\" (UID: \"de92ee0c-6604-4ac8-82dd-fcc0ece17287\") " pod="kuadrant-system/authorino-operator-657f44b778-x956l"
Apr 21 16:11:22.900154 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:22.900122 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95zx6\" (UniqueName: \"kubernetes.io/projected/de92ee0c-6604-4ac8-82dd-fcc0ece17287-kube-api-access-95zx6\") pod \"authorino-operator-657f44b778-x956l\" (UID: \"de92ee0c-6604-4ac8-82dd-fcc0ece17287\") " pod="kuadrant-system/authorino-operator-657f44b778-x956l"
Apr 21 16:11:22.916127 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:22.916102 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95zx6\" (UniqueName: \"kubernetes.io/projected/de92ee0c-6604-4ac8-82dd-fcc0ece17287-kube-api-access-95zx6\") pod \"authorino-operator-657f44b778-x956l\" (UID: \"de92ee0c-6604-4ac8-82dd-fcc0ece17287\") " pod="kuadrant-system/authorino-operator-657f44b778-x956l"
Apr 21 16:11:22.995707 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:22.995548 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-x956l"
Apr 21 16:11:23.119258 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:23.119226 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-x956l"]
Apr 21 16:11:23.122752 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:11:23.122723 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde92ee0c_6604_4ac8_82dd_fcc0ece17287.slice/crio-d1925daef460b6c83185c908cc89b36b9f5d23a7461eaa0df1f5a338ecb296c7 WatchSource:0}: Error finding container d1925daef460b6c83185c908cc89b36b9f5d23a7461eaa0df1f5a338ecb296c7: Status 404 returned error can't find the container with id d1925daef460b6c83185c908cc89b36b9f5d23a7461eaa0df1f5a338ecb296c7
Apr 21 16:11:23.693430 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:23.693389 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-x956l" event={"ID":"de92ee0c-6604-4ac8-82dd-fcc0ece17287","Type":"ContainerStarted","Data":"d1925daef460b6c83185c908cc89b36b9f5d23a7461eaa0df1f5a338ecb296c7"}
Apr 21 16:11:25.700826 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:25.700795 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-x956l" event={"ID":"de92ee0c-6604-4ac8-82dd-fcc0ece17287","Type":"ContainerStarted","Data":"ec1543dcbf26908e324b82e388429e647a8a448a49dd32c561921c726d92d103"}
Apr 21 16:11:25.701201 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:25.700842 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-x956l"
Apr 21 16:11:25.722411 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:25.722364 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-x956l" podStartSLOduration=1.724486475 podStartE2EDuration="3.722351177s" podCreationTimestamp="2026-04-21 16:11:22 +0000 UTC" firstStartedPulling="2026-04-21 16:11:23.124401047 +0000 UTC m=+579.731303132" lastFinishedPulling="2026-04-21 16:11:25.122265746 +0000 UTC m=+581.729167834" observedRunningTime="2026-04-21 16:11:25.720667297 +0000 UTC m=+582.327569416" watchObservedRunningTime="2026-04-21 16:11:25.722351177 +0000 UTC m=+582.329253284"
Apr 21 16:11:36.706732 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:36.706701 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-x956l"
Apr 21 16:11:43.987945 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:43.987917 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rkhl_06e52eb9-a2bb-4db2-8e4f-f435db21156c/ovn-acl-logging/0.log"
Apr 21 16:11:43.989139 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:43.989117 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rkhl_06e52eb9-a2bb-4db2-8e4f-f435db21156c/ovn-acl-logging/0.log"
Apr 21 16:11:48.078469 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:48.078430 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm"]
Apr 21 16:11:48.081314 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:48.081294 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm"
Apr 21 16:11:48.085108 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:48.085087 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-x4qcz\""
Apr 21 16:11:48.098907 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:48.098874 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm"]
Apr 21 16:11:48.166092 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:48.166058 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/53d95bef-96c4-4eee-8773-e869d5c92f76-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-rwfgm\" (UID: \"53d95bef-96c4-4eee-8773-e869d5c92f76\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm"
Apr 21 16:11:48.166260 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:48.166113 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwgbk\" (UniqueName: \"kubernetes.io/projected/53d95bef-96c4-4eee-8773-e869d5c92f76-kube-api-access-cwgbk\") pod \"kuadrant-operator-controller-manager-55c7f4c975-rwfgm\" (UID: \"53d95bef-96c4-4eee-8773-e869d5c92f76\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm"
Apr 21 16:11:48.266951 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:48.266911 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/53d95bef-96c4-4eee-8773-e869d5c92f76-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-rwfgm\" (UID: \"53d95bef-96c4-4eee-8773-e869d5c92f76\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm"
Apr 21 16:11:48.267118 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:48.267065 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwgbk\" (UniqueName: \"kubernetes.io/projected/53d95bef-96c4-4eee-8773-e869d5c92f76-kube-api-access-cwgbk\") pod \"kuadrant-operator-controller-manager-55c7f4c975-rwfgm\" (UID: \"53d95bef-96c4-4eee-8773-e869d5c92f76\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm"
Apr 21 16:11:48.267283 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:48.267263 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/53d95bef-96c4-4eee-8773-e869d5c92f76-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-rwfgm\" (UID: \"53d95bef-96c4-4eee-8773-e869d5c92f76\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm"
Apr 21 16:11:48.281563 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:48.281534 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwgbk\" (UniqueName: \"kubernetes.io/projected/53d95bef-96c4-4eee-8773-e869d5c92f76-kube-api-access-cwgbk\") pod \"kuadrant-operator-controller-manager-55c7f4c975-rwfgm\" (UID: \"53d95bef-96c4-4eee-8773-e869d5c92f76\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm"
Apr 21 16:11:48.392329 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:48.392246 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm"
Apr 21 16:11:48.531029 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:48.530975 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm"]
Apr 21 16:11:48.533523 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:11:48.533495 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53d95bef_96c4_4eee_8773_e869d5c92f76.slice/crio-7651a3996709e3b9a5fcc2b031ffe7e5e0e6d1662e7e69a39a4bdfbeed8ed468 WatchSource:0}: Error finding container 7651a3996709e3b9a5fcc2b031ffe7e5e0e6d1662e7e69a39a4bdfbeed8ed468: Status 404 returned error can't find the container with id 7651a3996709e3b9a5fcc2b031ffe7e5e0e6d1662e7e69a39a4bdfbeed8ed468
Apr 21 16:11:48.776161 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:48.776073 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm" event={"ID":"53d95bef-96c4-4eee-8773-e869d5c92f76","Type":"ContainerStarted","Data":"7651a3996709e3b9a5fcc2b031ffe7e5e0e6d1662e7e69a39a4bdfbeed8ed468"}
Apr 21 16:11:53.793855 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:53.793824 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm" event={"ID":"53d95bef-96c4-4eee-8773-e869d5c92f76","Type":"ContainerStarted","Data":"ec0630d6a75e5d0b9346c56f63c77f514c1f4b568e00172fb8f5d319f8a033fc"}
Apr 21 16:11:53.794235 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:53.793966 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm"
Apr 21 16:11:53.824173 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:11:53.824122 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm" podStartSLOduration=1.43684808 podStartE2EDuration="5.824106289s" podCreationTimestamp="2026-04-21 16:11:48 +0000 UTC" firstStartedPulling="2026-04-21 16:11:48.536506235 +0000 UTC m=+605.143408324" lastFinishedPulling="2026-04-21 16:11:52.923764441 +0000 UTC m=+609.530666533" observedRunningTime="2026-04-21 16:11:53.821939406 +0000 UTC m=+610.428841512" watchObservedRunningTime="2026-04-21 16:11:53.824106289 +0000 UTC m=+610.431008396"
Apr 21 16:12:04.799807 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:12:04.799709 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm"
Apr 21 16:13:01.964357 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:01.964320 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-759bd7ff6b-98n6m"]
Apr 21 16:13:01.966961 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:01.966937 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-759bd7ff6b-98n6m"
Apr 21 16:13:01.969485 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:01.969470 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-hr5hm\""
Apr 21 16:13:01.979092 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:01.979071 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-759bd7ff6b-98n6m"]
Apr 21 16:13:02.045986 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:02.045957 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ts4r\" (UniqueName: \"kubernetes.io/projected/ab510dab-9963-44cc-8889-48a23253a2a9-kube-api-access-6ts4r\") pod \"maas-controller-759bd7ff6b-98n6m\" (UID: \"ab510dab-9963-44cc-8889-48a23253a2a9\") " pod="opendatahub/maas-controller-759bd7ff6b-98n6m"
Apr 21 16:13:02.146655 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:02.146621 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ts4r\" (UniqueName: \"kubernetes.io/projected/ab510dab-9963-44cc-8889-48a23253a2a9-kube-api-access-6ts4r\") pod \"maas-controller-759bd7ff6b-98n6m\" (UID: \"ab510dab-9963-44cc-8889-48a23253a2a9\") " pod="opendatahub/maas-controller-759bd7ff6b-98n6m"
Apr 21 16:13:02.155187 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:02.155159 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ts4r\" (UniqueName: \"kubernetes.io/projected/ab510dab-9963-44cc-8889-48a23253a2a9-kube-api-access-6ts4r\") pod \"maas-controller-759bd7ff6b-98n6m\" (UID: \"ab510dab-9963-44cc-8889-48a23253a2a9\") " pod="opendatahub/maas-controller-759bd7ff6b-98n6m"
Apr 21 16:13:02.276801 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:02.276698 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-759bd7ff6b-98n6m" Apr 21 16:13:02.405228 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:02.405202 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-759bd7ff6b-98n6m"] Apr 21 16:13:02.407877 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:13:02.407849 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab510dab_9963_44cc_8889_48a23253a2a9.slice/crio-c0ae31c6bba760d3964fb0524352ecea7fa480efb2dc9a59abe853754c03eb36 WatchSource:0}: Error finding container c0ae31c6bba760d3964fb0524352ecea7fa480efb2dc9a59abe853754c03eb36: Status 404 returned error can't find the container with id c0ae31c6bba760d3964fb0524352ecea7fa480efb2dc9a59abe853754c03eb36 Apr 21 16:13:03.007600 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:03.007557 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-759bd7ff6b-98n6m" event={"ID":"ab510dab-9963-44cc-8889-48a23253a2a9","Type":"ContainerStarted","Data":"c0ae31c6bba760d3964fb0524352ecea7fa480efb2dc9a59abe853754c03eb36"} Apr 21 16:13:05.016243 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:05.016205 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-759bd7ff6b-98n6m" event={"ID":"ab510dab-9963-44cc-8889-48a23253a2a9","Type":"ContainerStarted","Data":"d07f2af27c99454644f7598178904a993369090d1bc5dcc3c692df75502dfbd4"} Apr 21 16:13:05.016643 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:05.016432 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-759bd7ff6b-98n6m" Apr 21 16:13:05.034664 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:05.034608 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-759bd7ff6b-98n6m" podStartSLOduration=1.50111306 podStartE2EDuration="4.034594412s" 
podCreationTimestamp="2026-04-21 16:13:01 +0000 UTC" firstStartedPulling="2026-04-21 16:13:02.40943941 +0000 UTC m=+679.016341495" lastFinishedPulling="2026-04-21 16:13:04.94292075 +0000 UTC m=+681.549822847" observedRunningTime="2026-04-21 16:13:05.033124164 +0000 UTC m=+681.640026307" watchObservedRunningTime="2026-04-21 16:13:05.034594412 +0000 UTC m=+681.641496538" Apr 21 16:13:16.025297 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:16.025265 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-759bd7ff6b-98n6m" Apr 21 16:13:16.349791 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:16.349738 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-548ffc956-78czx"] Apr 21 16:13:16.352162 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:16.352137 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-548ffc956-78czx" Apr 21 16:13:16.364193 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:16.364165 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-548ffc956-78czx"] Apr 21 16:13:16.450401 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:16.450361 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm48q\" (UniqueName: \"kubernetes.io/projected/40b1a341-f51a-4e2e-8231-96f363ba818a-kube-api-access-tm48q\") pod \"maas-controller-548ffc956-78czx\" (UID: \"40b1a341-f51a-4e2e-8231-96f363ba818a\") " pod="opendatahub/maas-controller-548ffc956-78czx" Apr 21 16:13:16.551500 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:16.551460 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tm48q\" (UniqueName: \"kubernetes.io/projected/40b1a341-f51a-4e2e-8231-96f363ba818a-kube-api-access-tm48q\") pod \"maas-controller-548ffc956-78czx\" (UID: \"40b1a341-f51a-4e2e-8231-96f363ba818a\") " 
pod="opendatahub/maas-controller-548ffc956-78czx" Apr 21 16:13:16.560037 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:16.560007 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm48q\" (UniqueName: \"kubernetes.io/projected/40b1a341-f51a-4e2e-8231-96f363ba818a-kube-api-access-tm48q\") pod \"maas-controller-548ffc956-78czx\" (UID: \"40b1a341-f51a-4e2e-8231-96f363ba818a\") " pod="opendatahub/maas-controller-548ffc956-78czx" Apr 21 16:13:16.663093 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:16.663005 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-548ffc956-78czx" Apr 21 16:13:16.784871 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:16.784847 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-548ffc956-78czx"] Apr 21 16:13:16.787092 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:13:16.787065 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40b1a341_f51a_4e2e_8231_96f363ba818a.slice/crio-8f8b305790c978368ede986409e3b430e204f670eab075c732a9b14d4cbb6e06 WatchSource:0}: Error finding container 8f8b305790c978368ede986409e3b430e204f670eab075c732a9b14d4cbb6e06: Status 404 returned error can't find the container with id 8f8b305790c978368ede986409e3b430e204f670eab075c732a9b14d4cbb6e06 Apr 21 16:13:17.054259 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:17.054176 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-548ffc956-78czx" event={"ID":"40b1a341-f51a-4e2e-8231-96f363ba818a","Type":"ContainerStarted","Data":"8f8b305790c978368ede986409e3b430e204f670eab075c732a9b14d4cbb6e06"} Apr 21 16:13:18.058251 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:18.058219 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-548ffc956-78czx" 
event={"ID":"40b1a341-f51a-4e2e-8231-96f363ba818a","Type":"ContainerStarted","Data":"8877c559e96952ceb245b0c15e8779d3557a53121a46547c50ea4f65fd36b8aa"} Apr 21 16:13:18.058728 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:18.058337 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-548ffc956-78czx" Apr 21 16:13:18.077121 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:18.077076 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-548ffc956-78czx" podStartSLOduration=1.730594454 podStartE2EDuration="2.077061813s" podCreationTimestamp="2026-04-21 16:13:16 +0000 UTC" firstStartedPulling="2026-04-21 16:13:16.78829262 +0000 UTC m=+693.395194705" lastFinishedPulling="2026-04-21 16:13:17.134759966 +0000 UTC m=+693.741662064" observedRunningTime="2026-04-21 16:13:18.07544478 +0000 UTC m=+694.682346887" watchObservedRunningTime="2026-04-21 16:13:18.077061813 +0000 UTC m=+694.683963920" Apr 21 16:13:29.067339 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:29.067302 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-548ffc956-78czx" Apr 21 16:13:29.108320 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:29.108288 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-759bd7ff6b-98n6m"] Apr 21 16:13:29.108540 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:29.108498 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-759bd7ff6b-98n6m" podUID="ab510dab-9963-44cc-8889-48a23253a2a9" containerName="manager" containerID="cri-o://d07f2af27c99454644f7598178904a993369090d1bc5dcc3c692df75502dfbd4" gracePeriod=10 Apr 21 16:13:29.346076 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:29.346052 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-759bd7ff6b-98n6m" Apr 21 16:13:29.456265 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:29.456228 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ts4r\" (UniqueName: \"kubernetes.io/projected/ab510dab-9963-44cc-8889-48a23253a2a9-kube-api-access-6ts4r\") pod \"ab510dab-9963-44cc-8889-48a23253a2a9\" (UID: \"ab510dab-9963-44cc-8889-48a23253a2a9\") " Apr 21 16:13:29.458499 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:29.458471 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab510dab-9963-44cc-8889-48a23253a2a9-kube-api-access-6ts4r" (OuterVolumeSpecName: "kube-api-access-6ts4r") pod "ab510dab-9963-44cc-8889-48a23253a2a9" (UID: "ab510dab-9963-44cc-8889-48a23253a2a9"). InnerVolumeSpecName "kube-api-access-6ts4r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 16:13:29.557511 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:29.557476 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6ts4r\" (UniqueName: \"kubernetes.io/projected/ab510dab-9963-44cc-8889-48a23253a2a9-kube-api-access-6ts4r\") on node \"ip-10-0-129-96.ec2.internal\" DevicePath \"\"" Apr 21 16:13:30.097729 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:30.097696 2577 generic.go:358] "Generic (PLEG): container finished" podID="ab510dab-9963-44cc-8889-48a23253a2a9" containerID="d07f2af27c99454644f7598178904a993369090d1bc5dcc3c692df75502dfbd4" exitCode=0 Apr 21 16:13:30.098161 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:30.097754 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-759bd7ff6b-98n6m" Apr 21 16:13:30.098161 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:30.097802 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-759bd7ff6b-98n6m" event={"ID":"ab510dab-9963-44cc-8889-48a23253a2a9","Type":"ContainerDied","Data":"d07f2af27c99454644f7598178904a993369090d1bc5dcc3c692df75502dfbd4"} Apr 21 16:13:30.098161 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:30.097846 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-759bd7ff6b-98n6m" event={"ID":"ab510dab-9963-44cc-8889-48a23253a2a9","Type":"ContainerDied","Data":"c0ae31c6bba760d3964fb0524352ecea7fa480efb2dc9a59abe853754c03eb36"} Apr 21 16:13:30.098161 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:30.097862 2577 scope.go:117] "RemoveContainer" containerID="d07f2af27c99454644f7598178904a993369090d1bc5dcc3c692df75502dfbd4" Apr 21 16:13:30.107695 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:30.107677 2577 scope.go:117] "RemoveContainer" containerID="d07f2af27c99454644f7598178904a993369090d1bc5dcc3c692df75502dfbd4" Apr 21 16:13:30.107978 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:13:30.107955 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d07f2af27c99454644f7598178904a993369090d1bc5dcc3c692df75502dfbd4\": container with ID starting with d07f2af27c99454644f7598178904a993369090d1bc5dcc3c692df75502dfbd4 not found: ID does not exist" containerID="d07f2af27c99454644f7598178904a993369090d1bc5dcc3c692df75502dfbd4" Apr 21 16:13:30.108091 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:30.107982 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d07f2af27c99454644f7598178904a993369090d1bc5dcc3c692df75502dfbd4"} err="failed to get container status \"d07f2af27c99454644f7598178904a993369090d1bc5dcc3c692df75502dfbd4\": rpc error: code = 
NotFound desc = could not find container \"d07f2af27c99454644f7598178904a993369090d1bc5dcc3c692df75502dfbd4\": container with ID starting with d07f2af27c99454644f7598178904a993369090d1bc5dcc3c692df75502dfbd4 not found: ID does not exist" Apr 21 16:13:30.118176 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:30.118149 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-759bd7ff6b-98n6m"] Apr 21 16:13:30.123175 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:30.123154 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-759bd7ff6b-98n6m"] Apr 21 16:13:32.054473 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:32.054438 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab510dab-9963-44cc-8889-48a23253a2a9" path="/var/lib/kubelet/pods/ab510dab-9963-44cc-8889-48a23253a2a9/volumes" Apr 21 16:13:47.757968 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:47.757932 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5"] Apr 21 16:13:47.758409 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:47.758254 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab510dab-9963-44cc-8889-48a23253a2a9" containerName="manager" Apr 21 16:13:47.758409 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:47.758267 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab510dab-9963-44cc-8889-48a23253a2a9" containerName="manager" Apr 21 16:13:47.758409 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:47.758333 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab510dab-9963-44cc-8889-48a23253a2a9" containerName="manager" Apr 21 16:13:47.763717 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:47.763697 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 16:13:47.766644 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:47.766624 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 21 16:13:47.767944 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:47.767927 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-wpv9r\"" Apr 21 16:13:47.768142 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:47.768127 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 21 16:13:47.768195 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:47.768158 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 21 16:13:47.771927 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:47.771906 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5"] Apr 21 16:13:47.910138 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:47.909872 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/350e3117-8b4e-488d-a5e4-5223c951d17c-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5\" (UID: \"350e3117-8b4e-488d-a5e4-5223c951d17c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 16:13:47.910138 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:47.909920 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/350e3117-8b4e-488d-a5e4-5223c951d17c-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5\" (UID: \"350e3117-8b4e-488d-a5e4-5223c951d17c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 
16:13:47.910138 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:47.909956 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/350e3117-8b4e-488d-a5e4-5223c951d17c-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5\" (UID: \"350e3117-8b4e-488d-a5e4-5223c951d17c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 16:13:47.910138 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:47.909987 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/350e3117-8b4e-488d-a5e4-5223c951d17c-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5\" (UID: \"350e3117-8b4e-488d-a5e4-5223c951d17c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 16:13:47.910138 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:47.910020 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/350e3117-8b4e-488d-a5e4-5223c951d17c-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5\" (UID: \"350e3117-8b4e-488d-a5e4-5223c951d17c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 16:13:47.910138 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:47.910058 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mzdt\" (UniqueName: \"kubernetes.io/projected/350e3117-8b4e-488d-a5e4-5223c951d17c-kube-api-access-6mzdt\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5\" (UID: \"350e3117-8b4e-488d-a5e4-5223c951d17c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 16:13:48.011837 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:48.010824 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/350e3117-8b4e-488d-a5e4-5223c951d17c-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5\" (UID: \"350e3117-8b4e-488d-a5e4-5223c951d17c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 16:13:48.011837 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:48.010876 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/350e3117-8b4e-488d-a5e4-5223c951d17c-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5\" (UID: \"350e3117-8b4e-488d-a5e4-5223c951d17c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 16:13:48.011837 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:48.010910 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/350e3117-8b4e-488d-a5e4-5223c951d17c-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5\" (UID: \"350e3117-8b4e-488d-a5e4-5223c951d17c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 16:13:48.011837 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:48.010942 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/350e3117-8b4e-488d-a5e4-5223c951d17c-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5\" (UID: \"350e3117-8b4e-488d-a5e4-5223c951d17c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 16:13:48.011837 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:48.010977 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/350e3117-8b4e-488d-a5e4-5223c951d17c-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5\" (UID: 
\"350e3117-8b4e-488d-a5e4-5223c951d17c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 16:13:48.011837 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:48.011021 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mzdt\" (UniqueName: \"kubernetes.io/projected/350e3117-8b4e-488d-a5e4-5223c951d17c-kube-api-access-6mzdt\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5\" (UID: \"350e3117-8b4e-488d-a5e4-5223c951d17c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 16:13:48.012267 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:48.012242 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/350e3117-8b4e-488d-a5e4-5223c951d17c-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5\" (UID: \"350e3117-8b4e-488d-a5e4-5223c951d17c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 16:13:48.012967 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:48.012938 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/350e3117-8b4e-488d-a5e4-5223c951d17c-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5\" (UID: \"350e3117-8b4e-488d-a5e4-5223c951d17c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 16:13:48.013219 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:48.013195 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/350e3117-8b4e-488d-a5e4-5223c951d17c-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5\" (UID: \"350e3117-8b4e-488d-a5e4-5223c951d17c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 16:13:48.015068 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:48.015027 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/350e3117-8b4e-488d-a5e4-5223c951d17c-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5\" (UID: \"350e3117-8b4e-488d-a5e4-5223c951d17c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 16:13:48.015343 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:48.015319 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/350e3117-8b4e-488d-a5e4-5223c951d17c-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5\" (UID: \"350e3117-8b4e-488d-a5e4-5223c951d17c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 16:13:48.020575 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:48.020551 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mzdt\" (UniqueName: \"kubernetes.io/projected/350e3117-8b4e-488d-a5e4-5223c951d17c-kube-api-access-6mzdt\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5\" (UID: \"350e3117-8b4e-488d-a5e4-5223c951d17c\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 16:13:48.074423 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:48.074390 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 16:13:48.215094 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:48.215063 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5"] Apr 21 16:13:49.159765 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:49.159724 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" event={"ID":"350e3117-8b4e-488d-a5e4-5223c951d17c","Type":"ContainerStarted","Data":"e9880363d20b9529ee6aea6def222d6da5b32034c9ca097493d205a7940581fa"} Apr 21 16:13:54.179352 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:13:54.179315 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" event={"ID":"350e3117-8b4e-488d-a5e4-5223c951d17c","Type":"ContainerStarted","Data":"fa29f94e48523edd37a59409b7c9c782c7bf5941759792bc53058e6f00aa27a0"} Apr 21 16:14:00.199540 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:00.199505 2577 generic.go:358] "Generic (PLEG): container finished" podID="350e3117-8b4e-488d-a5e4-5223c951d17c" containerID="fa29f94e48523edd37a59409b7c9c782c7bf5941759792bc53058e6f00aa27a0" exitCode=0 Apr 21 16:14:00.200018 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:00.199584 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" event={"ID":"350e3117-8b4e-488d-a5e4-5223c951d17c","Type":"ContainerDied","Data":"fa29f94e48523edd37a59409b7c9c782c7bf5941759792bc53058e6f00aa27a0"} Apr 21 16:14:02.215541 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:02.214700 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" event={"ID":"350e3117-8b4e-488d-a5e4-5223c951d17c","Type":"ContainerStarted","Data":"068b5c37517ff88c1e58f3e4485c80dffd1f07900d2c89e9b564467b8a2bbed8"} Apr 21 16:14:02.215541 ip-10-0-129-96 
kubenswrapper[2577]: I0421 16:14:02.215498 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 16:14:02.236154 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:02.236102 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" podStartSLOduration=2.137033591 podStartE2EDuration="15.236089539s" podCreationTimestamp="2026-04-21 16:13:47 +0000 UTC" firstStartedPulling="2026-04-21 16:13:48.216853728 +0000 UTC m=+724.823755818" lastFinishedPulling="2026-04-21 16:14:01.315909662 +0000 UTC m=+737.922811766" observedRunningTime="2026-04-21 16:14:02.233660508 +0000 UTC m=+738.840562643" watchObservedRunningTime="2026-04-21 16:14:02.236089539 +0000 UTC m=+738.842991647" Apr 21 16:14:13.235048 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:13.235015 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5" Apr 21 16:14:21.657368 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.657329 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"] Apr 21 16:14:21.687497 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.687469 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"] Apr 21 16:14:21.687645 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.687583 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:21.690571 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.690550 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\""
Apr 21 16:14:21.787594 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.787564 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fe45ed93-8e23-4a44-be3e-c93b37f3e6e4-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-lh778\" (UID: \"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:21.787594 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.787598 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fe45ed93-8e23-4a44-be3e-c93b37f3e6e4-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-lh778\" (UID: \"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:21.787836 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.787658 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fe45ed93-8e23-4a44-be3e-c93b37f3e6e4-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-lh778\" (UID: \"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:21.787836 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.787728 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe45ed93-8e23-4a44-be3e-c93b37f3e6e4-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-lh778\" (UID: \"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:21.787836 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.787758 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fe45ed93-8e23-4a44-be3e-c93b37f3e6e4-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-lh778\" (UID: \"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:21.787836 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.787807 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld8tv\" (UniqueName: \"kubernetes.io/projected/fe45ed93-8e23-4a44-be3e-c93b37f3e6e4-kube-api-access-ld8tv\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-lh778\" (UID: \"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:21.888866 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.888824 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fe45ed93-8e23-4a44-be3e-c93b37f3e6e4-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-lh778\" (UID: \"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:21.888866 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.888871 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fe45ed93-8e23-4a44-be3e-c93b37f3e6e4-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-lh778\" (UID: \"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:21.889117 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.888924 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fe45ed93-8e23-4a44-be3e-c93b37f3e6e4-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-lh778\" (UID: \"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:21.889117 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.888983 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe45ed93-8e23-4a44-be3e-c93b37f3e6e4-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-lh778\" (UID: \"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:21.889117 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.889015 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fe45ed93-8e23-4a44-be3e-c93b37f3e6e4-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-lh778\" (UID: \"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:21.889117 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.889035 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ld8tv\" (UniqueName: \"kubernetes.io/projected/fe45ed93-8e23-4a44-be3e-c93b37f3e6e4-kube-api-access-ld8tv\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-lh778\" (UID: \"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:21.889584 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.889433 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe45ed93-8e23-4a44-be3e-c93b37f3e6e4-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-lh778\" (UID: \"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:21.889584 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.889491 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fe45ed93-8e23-4a44-be3e-c93b37f3e6e4-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-lh778\" (UID: \"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:21.889584 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.889521 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fe45ed93-8e23-4a44-be3e-c93b37f3e6e4-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-lh778\" (UID: \"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:21.891206 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.891184 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fe45ed93-8e23-4a44-be3e-c93b37f3e6e4-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-lh778\" (UID: \"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:21.891344 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.891326 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fe45ed93-8e23-4a44-be3e-c93b37f3e6e4-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-lh778\" (UID: \"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:21.897947 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.897927 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld8tv\" (UniqueName: \"kubernetes.io/projected/fe45ed93-8e23-4a44-be3e-c93b37f3e6e4-kube-api-access-ld8tv\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-lh778\" (UID: \"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:21.997256 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:21.997182 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:22.127708 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:22.127684 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"]
Apr 21 16:14:22.130073 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:14:22.130037 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe45ed93_8e23_4a44_be3e_c93b37f3e6e4.slice/crio-6c10b0595d542ef0fcd6d8046d90264691b358a2082b42019680ee871a9f813c WatchSource:0}: Error finding container 6c10b0595d542ef0fcd6d8046d90264691b358a2082b42019680ee871a9f813c: Status 404 returned error can't find the container with id 6c10b0595d542ef0fcd6d8046d90264691b358a2082b42019680ee871a9f813c
Apr 21 16:14:22.131970 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:22.131953 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 16:14:22.287648 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:22.287562 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778" event={"ID":"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4","Type":"ContainerStarted","Data":"a59fbe3466c5fb79dabea31d38d352741e7cecb7140c7f73ec9472ca0c9dc97b"}
Apr 21 16:14:22.287648 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:22.287597 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778" event={"ID":"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4","Type":"ContainerStarted","Data":"6c10b0595d542ef0fcd6d8046d90264691b358a2082b42019680ee871a9f813c"}
Apr 21 16:14:29.312204 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:29.312167 2577 generic.go:358] "Generic (PLEG): container finished" podID="fe45ed93-8e23-4a44-be3e-c93b37f3e6e4" containerID="a59fbe3466c5fb79dabea31d38d352741e7cecb7140c7f73ec9472ca0c9dc97b" exitCode=0
Apr 21 16:14:29.312581 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:29.312251 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778" event={"ID":"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4","Type":"ContainerDied","Data":"a59fbe3466c5fb79dabea31d38d352741e7cecb7140c7f73ec9472ca0c9dc97b"}
Apr 21 16:14:30.317797 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:30.317739 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778" event={"ID":"fe45ed93-8e23-4a44-be3e-c93b37f3e6e4","Type":"ContainerStarted","Data":"d536285a21c8a96b370ebab72425aaafc1ba7b2af1414d959f119799bc726595"}
Apr 21 16:14:30.318248 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:30.318135 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:30.345344 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:30.345295 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778" podStartSLOduration=9.087978198 podStartE2EDuration="9.34528146s" podCreationTimestamp="2026-04-21 16:14:21 +0000 UTC" firstStartedPulling="2026-04-21 16:14:29.313054603 +0000 UTC m=+765.919956696" lastFinishedPulling="2026-04-21 16:14:29.570357872 +0000 UTC m=+766.177259958" observedRunningTime="2026-04-21 16:14:30.343729246 +0000 UTC m=+766.950631354" watchObservedRunningTime="2026-04-21 16:14:30.34528146 +0000 UTC m=+766.952183567"
Apr 21 16:14:33.357229 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.357186 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"]
Apr 21 16:14:33.361461 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.361435 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:14:33.364222 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.364197 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\""
Apr 21 16:14:33.383436 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.383413 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"]
Apr 21 16:14:33.481046 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.481006 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/71164a75-22e7-49a2-a405-4df52c142879-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc\" (UID: \"71164a75-22e7-49a2-a405-4df52c142879\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:14:33.481046 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.481052 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71164a75-22e7-49a2-a405-4df52c142879-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc\" (UID: \"71164a75-22e7-49a2-a405-4df52c142879\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:14:33.481267 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.481130 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71164a75-22e7-49a2-a405-4df52c142879-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc\" (UID: \"71164a75-22e7-49a2-a405-4df52c142879\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:14:33.481267 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.481181 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/71164a75-22e7-49a2-a405-4df52c142879-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc\" (UID: \"71164a75-22e7-49a2-a405-4df52c142879\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:14:33.481267 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.481237 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/71164a75-22e7-49a2-a405-4df52c142879-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc\" (UID: \"71164a75-22e7-49a2-a405-4df52c142879\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:14:33.481381 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.481273 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99fl6\" (UniqueName: \"kubernetes.io/projected/71164a75-22e7-49a2-a405-4df52c142879-kube-api-access-99fl6\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc\" (UID: \"71164a75-22e7-49a2-a405-4df52c142879\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:14:33.582664 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.582616 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71164a75-22e7-49a2-a405-4df52c142879-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc\" (UID: \"71164a75-22e7-49a2-a405-4df52c142879\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:14:33.582855 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.582677 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71164a75-22e7-49a2-a405-4df52c142879-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc\" (UID: \"71164a75-22e7-49a2-a405-4df52c142879\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:14:33.582855 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.582716 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/71164a75-22e7-49a2-a405-4df52c142879-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc\" (UID: \"71164a75-22e7-49a2-a405-4df52c142879\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:14:33.582855 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.582746 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/71164a75-22e7-49a2-a405-4df52c142879-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc\" (UID: \"71164a75-22e7-49a2-a405-4df52c142879\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:14:33.582855 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.582817 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99fl6\" (UniqueName: \"kubernetes.io/projected/71164a75-22e7-49a2-a405-4df52c142879-kube-api-access-99fl6\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc\" (UID: \"71164a75-22e7-49a2-a405-4df52c142879\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:14:33.583073 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.582873 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/71164a75-22e7-49a2-a405-4df52c142879-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc\" (UID: \"71164a75-22e7-49a2-a405-4df52c142879\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:14:33.583130 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.583106 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/71164a75-22e7-49a2-a405-4df52c142879-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc\" (UID: \"71164a75-22e7-49a2-a405-4df52c142879\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:14:33.583130 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.583119 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/71164a75-22e7-49a2-a405-4df52c142879-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc\" (UID: \"71164a75-22e7-49a2-a405-4df52c142879\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:14:33.583223 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.583203 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/71164a75-22e7-49a2-a405-4df52c142879-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc\" (UID: \"71164a75-22e7-49a2-a405-4df52c142879\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:14:33.585094 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.585071 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/71164a75-22e7-49a2-a405-4df52c142879-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc\" (UID: \"71164a75-22e7-49a2-a405-4df52c142879\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:14:33.585193 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.585176 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/71164a75-22e7-49a2-a405-4df52c142879-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc\" (UID: \"71164a75-22e7-49a2-a405-4df52c142879\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:14:33.591835 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.591792 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99fl6\" (UniqueName: \"kubernetes.io/projected/71164a75-22e7-49a2-a405-4df52c142879-kube-api-access-99fl6\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc\" (UID: \"71164a75-22e7-49a2-a405-4df52c142879\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:14:33.673724 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.673645 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:14:33.801939 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:33.801904 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"]
Apr 21 16:14:33.805386 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:14:33.805358 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71164a75_22e7_49a2_a405_4df52c142879.slice/crio-5395be3ccf6a38123b4e25f1f3bdbfa22b5e63257cd30e5917df6102a81023e4 WatchSource:0}: Error finding container 5395be3ccf6a38123b4e25f1f3bdbfa22b5e63257cd30e5917df6102a81023e4: Status 404 returned error can't find the container with id 5395be3ccf6a38123b4e25f1f3bdbfa22b5e63257cd30e5917df6102a81023e4
Apr 21 16:14:34.333714 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:34.333676 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc" event={"ID":"71164a75-22e7-49a2-a405-4df52c142879","Type":"ContainerStarted","Data":"2265ed6753d8a1eae8a749afbb90a6f45bd53537b01fa9bfdb5c7659b13d3858"}
Apr 21 16:14:34.333714 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:34.333718 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc" event={"ID":"71164a75-22e7-49a2-a405-4df52c142879","Type":"ContainerStarted","Data":"5395be3ccf6a38123b4e25f1f3bdbfa22b5e63257cd30e5917df6102a81023e4"}
Apr 21 16:14:40.354200 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:40.354166 2577 generic.go:358] "Generic (PLEG): container finished" podID="71164a75-22e7-49a2-a405-4df52c142879" containerID="2265ed6753d8a1eae8a749afbb90a6f45bd53537b01fa9bfdb5c7659b13d3858" exitCode=0
Apr 21 16:14:40.354597 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:40.354246 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc" event={"ID":"71164a75-22e7-49a2-a405-4df52c142879","Type":"ContainerDied","Data":"2265ed6753d8a1eae8a749afbb90a6f45bd53537b01fa9bfdb5c7659b13d3858"}
Apr 21 16:14:41.335094 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:41.335058 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-lh778"
Apr 21 16:14:41.359460 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:41.359427 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc" event={"ID":"71164a75-22e7-49a2-a405-4df52c142879","Type":"ContainerStarted","Data":"9fd5e21d2e85d561d721123fefd6290b08a5f04e25b62bb8b83fc94342b70827"}
Apr 21 16:14:41.359831 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:41.359648 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:14:41.392866 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:41.392817 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc" podStartSLOduration=8.147485767 podStartE2EDuration="8.392798694s" podCreationTimestamp="2026-04-21 16:14:33 +0000 UTC" firstStartedPulling="2026-04-21 16:14:40.354836609 +0000 UTC m=+776.961738693" lastFinishedPulling="2026-04-21 16:14:40.600149518 +0000 UTC m=+777.207051620" observedRunningTime="2026-04-21 16:14:41.392654659 +0000 UTC m=+777.999556767" watchObservedRunningTime="2026-04-21 16:14:41.392798694 +0000 UTC m=+777.999700799"
Apr 21 16:14:52.376757 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:14:52.376724 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc"
Apr 21 16:16:41.536526 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:16:41.536474 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-548ffc956-78czx"]
Apr 21 16:16:41.537091 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:16:41.536863 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-548ffc956-78czx" podUID="40b1a341-f51a-4e2e-8231-96f363ba818a" containerName="manager" containerID="cri-o://8877c559e96952ceb245b0c15e8779d3557a53121a46547c50ea4f65fd36b8aa" gracePeriod=10
Apr 21 16:16:41.768328 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:16:41.768303 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-548ffc956-78czx"
Apr 21 16:16:41.776368 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:16:41.776340 2577 generic.go:358] "Generic (PLEG): container finished" podID="40b1a341-f51a-4e2e-8231-96f363ba818a" containerID="8877c559e96952ceb245b0c15e8779d3557a53121a46547c50ea4f65fd36b8aa" exitCode=0
Apr 21 16:16:41.776475 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:16:41.776393 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-548ffc956-78czx"
Apr 21 16:16:41.776475 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:16:41.776427 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-548ffc956-78czx" event={"ID":"40b1a341-f51a-4e2e-8231-96f363ba818a","Type":"ContainerDied","Data":"8877c559e96952ceb245b0c15e8779d3557a53121a46547c50ea4f65fd36b8aa"}
Apr 21 16:16:41.776475 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:16:41.776459 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-548ffc956-78czx" event={"ID":"40b1a341-f51a-4e2e-8231-96f363ba818a","Type":"ContainerDied","Data":"8f8b305790c978368ede986409e3b430e204f670eab075c732a9b14d4cbb6e06"}
Apr 21 16:16:41.776597 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:16:41.776479 2577 scope.go:117] "RemoveContainer" containerID="8877c559e96952ceb245b0c15e8779d3557a53121a46547c50ea4f65fd36b8aa"
Apr 21 16:16:41.783838 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:16:41.783822 2577 scope.go:117] "RemoveContainer" containerID="8877c559e96952ceb245b0c15e8779d3557a53121a46547c50ea4f65fd36b8aa"
Apr 21 16:16:41.784096 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:16:41.784078 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8877c559e96952ceb245b0c15e8779d3557a53121a46547c50ea4f65fd36b8aa\": container with ID starting with 8877c559e96952ceb245b0c15e8779d3557a53121a46547c50ea4f65fd36b8aa not found: ID does not exist" containerID="8877c559e96952ceb245b0c15e8779d3557a53121a46547c50ea4f65fd36b8aa"
Apr 21 16:16:41.784161 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:16:41.784105 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8877c559e96952ceb245b0c15e8779d3557a53121a46547c50ea4f65fd36b8aa"} err="failed to get container status \"8877c559e96952ceb245b0c15e8779d3557a53121a46547c50ea4f65fd36b8aa\": rpc error: code = NotFound desc = could not find container \"8877c559e96952ceb245b0c15e8779d3557a53121a46547c50ea4f65fd36b8aa\": container with ID starting with 8877c559e96952ceb245b0c15e8779d3557a53121a46547c50ea4f65fd36b8aa not found: ID does not exist"
Apr 21 16:16:41.849865 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:16:41.849837 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm48q\" (UniqueName: \"kubernetes.io/projected/40b1a341-f51a-4e2e-8231-96f363ba818a-kube-api-access-tm48q\") pod \"40b1a341-f51a-4e2e-8231-96f363ba818a\" (UID: \"40b1a341-f51a-4e2e-8231-96f363ba818a\") "
Apr 21 16:16:41.851965 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:16:41.851935 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40b1a341-f51a-4e2e-8231-96f363ba818a-kube-api-access-tm48q" (OuterVolumeSpecName: "kube-api-access-tm48q") pod "40b1a341-f51a-4e2e-8231-96f363ba818a" (UID: "40b1a341-f51a-4e2e-8231-96f363ba818a"). InnerVolumeSpecName "kube-api-access-tm48q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 16:16:41.950315 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:16:41.950282 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tm48q\" (UniqueName: \"kubernetes.io/projected/40b1a341-f51a-4e2e-8231-96f363ba818a-kube-api-access-tm48q\") on node \"ip-10-0-129-96.ec2.internal\" DevicePath \"\""
Apr 21 16:16:42.099358 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:16:42.099324 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-548ffc956-78czx"]
Apr 21 16:16:42.114043 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:16:42.113976 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-548ffc956-78czx"]
Apr 21 16:16:44.009455 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:16:44.009425 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rkhl_06e52eb9-a2bb-4db2-8e4f-f435db21156c/ovn-acl-logging/0.log"
Apr 21 16:16:44.010928 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:16:44.010906 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rkhl_06e52eb9-a2bb-4db2-8e4f-f435db21156c/ovn-acl-logging/0.log"
Apr 21 16:16:44.054106 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:16:44.054072 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40b1a341-f51a-4e2e-8231-96f363ba818a" path="/var/lib/kubelet/pods/40b1a341-f51a-4e2e-8231-96f363ba818a/volumes"
Apr 21 16:21:44.033706 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:21:44.033674 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rkhl_06e52eb9-a2bb-4db2-8e4f-f435db21156c/ovn-acl-logging/0.log"
Apr 21 16:21:44.035842 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:21:44.035820 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rkhl_06e52eb9-a2bb-4db2-8e4f-f435db21156c/ovn-acl-logging/0.log"
Apr 21 16:26:44.055213 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:26:44.055183 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rkhl_06e52eb9-a2bb-4db2-8e4f-f435db21156c/ovn-acl-logging/0.log"
Apr 21 16:26:44.058452 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:26:44.058435 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rkhl_06e52eb9-a2bb-4db2-8e4f-f435db21156c/ovn-acl-logging/0.log"
Apr 21 16:26:55.754712 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:26:55.754675 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm"]
Apr 21 16:26:55.755132 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:26:55.754944 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm" podUID="53d95bef-96c4-4eee-8773-e869d5c92f76" containerName="manager" containerID="cri-o://ec0630d6a75e5d0b9346c56f63c77f514c1f4b568e00172fb8f5d319f8a033fc" gracePeriod=10
Apr 21 16:26:56.792047 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:26:56.792024 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm"
Apr 21 16:26:56.807149 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:26:56.807119 2577 generic.go:358] "Generic (PLEG): container finished" podID="53d95bef-96c4-4eee-8773-e869d5c92f76" containerID="ec0630d6a75e5d0b9346c56f63c77f514c1f4b568e00172fb8f5d319f8a033fc" exitCode=0
Apr 21 16:26:56.807280 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:26:56.807175 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm"
Apr 21 16:26:56.807280 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:26:56.807182 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm" event={"ID":"53d95bef-96c4-4eee-8773-e869d5c92f76","Type":"ContainerDied","Data":"ec0630d6a75e5d0b9346c56f63c77f514c1f4b568e00172fb8f5d319f8a033fc"}
Apr 21 16:26:56.807280 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:26:56.807209 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm" event={"ID":"53d95bef-96c4-4eee-8773-e869d5c92f76","Type":"ContainerDied","Data":"7651a3996709e3b9a5fcc2b031ffe7e5e0e6d1662e7e69a39a4bdfbeed8ed468"}
Apr 21 16:26:56.807280 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:26:56.807225 2577 scope.go:117] "RemoveContainer" containerID="ec0630d6a75e5d0b9346c56f63c77f514c1f4b568e00172fb8f5d319f8a033fc"
Apr 21 16:26:56.815450 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:26:56.815433 2577 scope.go:117] "RemoveContainer" containerID="ec0630d6a75e5d0b9346c56f63c77f514c1f4b568e00172fb8f5d319f8a033fc"
Apr 21 16:26:56.815693 ip-10-0-129-96 kubenswrapper[2577]: E0421 16:26:56.815669 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec0630d6a75e5d0b9346c56f63c77f514c1f4b568e00172fb8f5d319f8a033fc\": container with ID starting with ec0630d6a75e5d0b9346c56f63c77f514c1f4b568e00172fb8f5d319f8a033fc not found: ID does not exist" containerID="ec0630d6a75e5d0b9346c56f63c77f514c1f4b568e00172fb8f5d319f8a033fc"
Apr 21 16:26:56.815735 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:26:56.815702 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec0630d6a75e5d0b9346c56f63c77f514c1f4b568e00172fb8f5d319f8a033fc"} err="failed to get container status \"ec0630d6a75e5d0b9346c56f63c77f514c1f4b568e00172fb8f5d319f8a033fc\": rpc error: code = NotFound desc = could not find container \"ec0630d6a75e5d0b9346c56f63c77f514c1f4b568e00172fb8f5d319f8a033fc\": container with ID starting with ec0630d6a75e5d0b9346c56f63c77f514c1f4b568e00172fb8f5d319f8a033fc not found: ID does not exist"
Apr 21 16:26:56.940836 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:26:56.940744 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/53d95bef-96c4-4eee-8773-e869d5c92f76-extensions-socket-volume\") pod \"53d95bef-96c4-4eee-8773-e869d5c92f76\" (UID: \"53d95bef-96c4-4eee-8773-e869d5c92f76\") "
Apr 21 16:26:56.941000 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:26:56.940834 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwgbk\" (UniqueName: \"kubernetes.io/projected/53d95bef-96c4-4eee-8773-e869d5c92f76-kube-api-access-cwgbk\") pod \"53d95bef-96c4-4eee-8773-e869d5c92f76\" (UID: \"53d95bef-96c4-4eee-8773-e869d5c92f76\") "
Apr 21 16:26:56.941167 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:26:56.941143 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53d95bef-96c4-4eee-8773-e869d5c92f76-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "53d95bef-96c4-4eee-8773-e869d5c92f76" (UID: "53d95bef-96c4-4eee-8773-e869d5c92f76"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 16:26:56.942929 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:26:56.942904 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53d95bef-96c4-4eee-8773-e869d5c92f76-kube-api-access-cwgbk" (OuterVolumeSpecName: "kube-api-access-cwgbk") pod "53d95bef-96c4-4eee-8773-e869d5c92f76" (UID: "53d95bef-96c4-4eee-8773-e869d5c92f76"). InnerVolumeSpecName "kube-api-access-cwgbk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 16:26:57.042073 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:26:57.042032 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cwgbk\" (UniqueName: \"kubernetes.io/projected/53d95bef-96c4-4eee-8773-e869d5c92f76-kube-api-access-cwgbk\") on node \"ip-10-0-129-96.ec2.internal\" DevicePath \"\""
Apr 21 16:26:57.042073 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:26:57.042068 2577 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/53d95bef-96c4-4eee-8773-e869d5c92f76-extensions-socket-volume\") on node \"ip-10-0-129-96.ec2.internal\" DevicePath \"\""
Apr 21 16:26:57.129886 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:26:57.129855 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm"]
Apr 21 16:26:57.137077 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:26:57.137054 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-rwfgm"]
Apr 21 16:26:58.054693 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:26:58.054656 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53d95bef-96c4-4eee-8773-e869d5c92f76" path="/var/lib/kubelet/pods/53d95bef-96c4-4eee-8773-e869d5c92f76/volumes"
Apr 21 16:28:01.829334 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:01.829299 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nbzc2"]
Apr 21 16:28:01.829802 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:01.829582 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40b1a341-f51a-4e2e-8231-96f363ba818a" containerName="manager"
Apr 21 16:28:01.829802 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:01.829592 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b1a341-f51a-4e2e-8231-96f363ba818a" containerName="manager"
Apr 21 16:28:01.829802 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:01.829614 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53d95bef-96c4-4eee-8773-e869d5c92f76" containerName="manager"
Apr 21 16:28:01.829802 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:01.829620 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="53d95bef-96c4-4eee-8773-e869d5c92f76" containerName="manager"
Apr 21 16:28:01.829802 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:01.829665 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="40b1a341-f51a-4e2e-8231-96f363ba818a" containerName="manager"
Apr 21 16:28:01.829802 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:01.829673 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="53d95bef-96c4-4eee-8773-e869d5c92f76" containerName="manager"
Apr 21 16:28:01.832847 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:01.832821 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nbzc2" Apr 21 16:28:01.837840 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:01.837820 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-x4qcz\"" Apr 21 16:28:01.851846 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:01.851821 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nbzc2"] Apr 21 16:28:01.983700 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:01.983664 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttsbh\" (UniqueName: \"kubernetes.io/projected/aae08bd4-780e-45a7-b743-ad777540c26e-kube-api-access-ttsbh\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nbzc2\" (UID: \"aae08bd4-780e-45a7-b743-ad777540c26e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nbzc2" Apr 21 16:28:01.983909 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:01.983722 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/aae08bd4-780e-45a7-b743-ad777540c26e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nbzc2\" (UID: \"aae08bd4-780e-45a7-b743-ad777540c26e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nbzc2" Apr 21 16:28:02.084434 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:02.084347 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttsbh\" (UniqueName: \"kubernetes.io/projected/aae08bd4-780e-45a7-b743-ad777540c26e-kube-api-access-ttsbh\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nbzc2\" (UID: \"aae08bd4-780e-45a7-b743-ad777540c26e\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nbzc2" Apr 21 16:28:02.084434 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:02.084412 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/aae08bd4-780e-45a7-b743-ad777540c26e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nbzc2\" (UID: \"aae08bd4-780e-45a7-b743-ad777540c26e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nbzc2" Apr 21 16:28:02.084747 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:02.084731 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/aae08bd4-780e-45a7-b743-ad777540c26e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nbzc2\" (UID: \"aae08bd4-780e-45a7-b743-ad777540c26e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nbzc2" Apr 21 16:28:02.094542 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:02.094510 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttsbh\" (UniqueName: \"kubernetes.io/projected/aae08bd4-780e-45a7-b743-ad777540c26e-kube-api-access-ttsbh\") pod \"kuadrant-operator-controller-manager-55c7f4c975-nbzc2\" (UID: \"aae08bd4-780e-45a7-b743-ad777540c26e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nbzc2" Apr 21 16:28:02.142524 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:02.142492 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nbzc2" Apr 21 16:28:02.269025 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:02.268980 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nbzc2"] Apr 21 16:28:02.272440 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:28:02.272415 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaae08bd4_780e_45a7_b743_ad777540c26e.slice/crio-f280108769d37007439883f41c22efa503c05b93d109f58766f88b3e9031eb0a WatchSource:0}: Error finding container f280108769d37007439883f41c22efa503c05b93d109f58766f88b3e9031eb0a: Status 404 returned error can't find the container with id f280108769d37007439883f41c22efa503c05b93d109f58766f88b3e9031eb0a Apr 21 16:28:02.274698 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:02.274683 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 16:28:03.025142 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:03.025101 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nbzc2" event={"ID":"aae08bd4-780e-45a7-b743-ad777540c26e","Type":"ContainerStarted","Data":"58e98d833cdfe8315663c60ef06bdfe9d5707e39bd316745d218bc1a39ab6255"} Apr 21 16:28:03.025142 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:03.025140 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nbzc2" event={"ID":"aae08bd4-780e-45a7-b743-ad777540c26e","Type":"ContainerStarted","Data":"f280108769d37007439883f41c22efa503c05b93d109f58766f88b3e9031eb0a"} Apr 21 16:28:03.025568 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:03.025223 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nbzc2" Apr 21 16:28:03.050287 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:03.050235 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nbzc2" podStartSLOduration=2.050221267 podStartE2EDuration="2.050221267s" podCreationTimestamp="2026-04-21 16:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:28:03.047587418 +0000 UTC m=+1579.654489525" watchObservedRunningTime="2026-04-21 16:28:03.050221267 +0000 UTC m=+1579.657123374" Apr 21 16:28:14.031421 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:28:14.031389 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-nbzc2" Apr 21 16:31:44.082423 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:31:44.082301 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rkhl_06e52eb9-a2bb-4db2-8e4f-f435db21156c/ovn-acl-logging/0.log" Apr 21 16:31:44.085676 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:31:44.085655 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rkhl_06e52eb9-a2bb-4db2-8e4f-f435db21156c/ovn-acl-logging/0.log" Apr 21 16:36:44.102940 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:36:44.102825 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rkhl_06e52eb9-a2bb-4db2-8e4f-f435db21156c/ovn-acl-logging/0.log" Apr 21 16:36:44.106855 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:36:44.106152 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6rkhl_06e52eb9-a2bb-4db2-8e4f-f435db21156c/ovn-acl-logging/0.log" Apr 21 16:37:42.471397 ip-10-0-129-96 kubenswrapper[2577]: I0421 
16:37:42.471295 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-k56mb_af849393-15af-4721-953e-a91eb386709d/manager/0.log" Apr 21 16:37:42.827359 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:37:42.827320 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-bw97v_2423b3fc-41cc-40e4-956e-9f52f63198d3/manager/2.log" Apr 21 16:37:43.055884 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:37:43.055853 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-774f54dc87-st7dm_26789298-0cab-4321-846d-66101be60413/manager/0.log" Apr 21 16:37:44.667245 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:37:44.667206 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-x956l_de92ee0c-6604-4ac8-82dd-fcc0ece17287/manager/0.log" Apr 21 16:37:45.115415 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:37:45.115379 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-nbzc2_aae08bd4-780e-45a7-b743-ad777540c26e/manager/0.log" Apr 21 16:37:45.795751 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:37:45.795703 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-mkphv_4eda884f-b350-476a-b497-386c049422d2/discovery/0.log" Apr 21 16:37:45.901842 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:37:45.901808 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5598cc66fd-rczdm_45c5d172-f16d-4ebd-9d96-3a8ea68e126c/kube-auth-proxy/0.log" Apr 21 16:37:46.700765 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:37:46.700725 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc_71164a75-22e7-49a2-a405-4df52c142879/storage-initializer/0.log" Apr 21 
16:37:46.708755 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:37:46.708734 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-brdcc_71164a75-22e7-49a2-a405-4df52c142879/main/0.log" Apr 21 16:37:46.814277 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:37:46.814240 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-lh778_fe45ed93-8e23-4a44-be3e-c93b37f3e6e4/storage-initializer/0.log" Apr 21 16:37:46.823490 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:37:46.823467 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-lh778_fe45ed93-8e23-4a44-be3e-c93b37f3e6e4/main/0.log" Apr 21 16:37:47.045964 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:37:47.045884 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5_350e3117-8b4e-488d-a5e4-5223c951d17c/storage-initializer/0.log" Apr 21 16:37:47.053554 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:37:47.053523 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-rnvd5_350e3117-8b4e-488d-a5e4-5223c951d17c/main/0.log" Apr 21 16:37:53.941819 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:37:53.941790 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-fmhv9_27182b2f-30c8-4cef-90b0-b91b4f04047a/global-pull-secret-syncer/0.log" Apr 21 16:37:54.108991 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:37:54.108957 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-729fq_f334a381-fb5b-413d-b975-0e2511378c95/konnectivity-agent/0.log" Apr 21 16:37:54.174424 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:37:54.174394 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-96.ec2.internal_ef7a67774dc3cec2ee7c5d8876aafbd3/haproxy/0.log" Apr 21 16:37:58.572975 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:37:58.572941 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-x956l_de92ee0c-6604-4ac8-82dd-fcc0ece17287/manager/0.log" Apr 21 16:37:58.791097 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:37:58.791067 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-nbzc2_aae08bd4-780e-45a7-b743-ad777540c26e/manager/0.log" Apr 21 16:38:01.107113 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:01.107080 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k7rdr_d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6/node-exporter/0.log" Apr 21 16:38:01.128188 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:01.128161 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k7rdr_d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6/kube-rbac-proxy/0.log" Apr 21 16:38:01.155193 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:01.155165 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-k7rdr_d927e11d-e8c5-4b2b-bdf5-e25a34f37ac6/init-textfile/0.log" Apr 21 16:38:02.133434 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.133395 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff"] Apr 21 16:38:02.136554 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.136534 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" Apr 21 16:38:02.139419 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.139385 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zttst\"/\"openshift-service-ca.crt\"" Apr 21 16:38:02.140344 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.140327 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zttst\"/\"default-dockercfg-568sq\"" Apr 21 16:38:02.140446 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.140360 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zttst\"/\"kube-root-ca.crt\"" Apr 21 16:38:02.150385 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.150363 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff"] Apr 21 16:38:02.232547 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.232516 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/15f43225-3c5b-4024-903a-a95877ab329e-proc\") pod \"perf-node-gather-daemonset-f6vff\" (UID: \"15f43225-3c5b-4024-903a-a95877ab329e\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" Apr 21 16:38:02.232715 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.232569 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/15f43225-3c5b-4024-903a-a95877ab329e-podres\") pod \"perf-node-gather-daemonset-f6vff\" (UID: \"15f43225-3c5b-4024-903a-a95877ab329e\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" Apr 21 16:38:02.232715 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.232623 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/15f43225-3c5b-4024-903a-a95877ab329e-sys\") pod \"perf-node-gather-daemonset-f6vff\" (UID: \"15f43225-3c5b-4024-903a-a95877ab329e\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" Apr 21 16:38:02.232715 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.232649 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/15f43225-3c5b-4024-903a-a95877ab329e-lib-modules\") pod \"perf-node-gather-daemonset-f6vff\" (UID: \"15f43225-3c5b-4024-903a-a95877ab329e\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" Apr 21 16:38:02.233092 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.232718 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtmph\" (UniqueName: \"kubernetes.io/projected/15f43225-3c5b-4024-903a-a95877ab329e-kube-api-access-xtmph\") pod \"perf-node-gather-daemonset-f6vff\" (UID: \"15f43225-3c5b-4024-903a-a95877ab329e\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" Apr 21 16:38:02.333923 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.333877 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xtmph\" (UniqueName: \"kubernetes.io/projected/15f43225-3c5b-4024-903a-a95877ab329e-kube-api-access-xtmph\") pod \"perf-node-gather-daemonset-f6vff\" (UID: \"15f43225-3c5b-4024-903a-a95877ab329e\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" Apr 21 16:38:02.333923 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.333923 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/15f43225-3c5b-4024-903a-a95877ab329e-proc\") pod \"perf-node-gather-daemonset-f6vff\" (UID: \"15f43225-3c5b-4024-903a-a95877ab329e\") " 
pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" Apr 21 16:38:02.334166 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.333958 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/15f43225-3c5b-4024-903a-a95877ab329e-podres\") pod \"perf-node-gather-daemonset-f6vff\" (UID: \"15f43225-3c5b-4024-903a-a95877ab329e\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" Apr 21 16:38:02.334166 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.333975 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/15f43225-3c5b-4024-903a-a95877ab329e-sys\") pod \"perf-node-gather-daemonset-f6vff\" (UID: \"15f43225-3c5b-4024-903a-a95877ab329e\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" Apr 21 16:38:02.334166 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.333990 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/15f43225-3c5b-4024-903a-a95877ab329e-lib-modules\") pod \"perf-node-gather-daemonset-f6vff\" (UID: \"15f43225-3c5b-4024-903a-a95877ab329e\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" Apr 21 16:38:02.334166 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.334088 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/15f43225-3c5b-4024-903a-a95877ab329e-proc\") pod \"perf-node-gather-daemonset-f6vff\" (UID: \"15f43225-3c5b-4024-903a-a95877ab329e\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" Apr 21 16:38:02.334166 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.334119 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/15f43225-3c5b-4024-903a-a95877ab329e-podres\") pod 
\"perf-node-gather-daemonset-f6vff\" (UID: \"15f43225-3c5b-4024-903a-a95877ab329e\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" Apr 21 16:38:02.334166 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.334110 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/15f43225-3c5b-4024-903a-a95877ab329e-lib-modules\") pod \"perf-node-gather-daemonset-f6vff\" (UID: \"15f43225-3c5b-4024-903a-a95877ab329e\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" Apr 21 16:38:02.334166 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.334112 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/15f43225-3c5b-4024-903a-a95877ab329e-sys\") pod \"perf-node-gather-daemonset-f6vff\" (UID: \"15f43225-3c5b-4024-903a-a95877ab329e\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" Apr 21 16:38:02.345019 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.344999 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtmph\" (UniqueName: \"kubernetes.io/projected/15f43225-3c5b-4024-903a-a95877ab329e-kube-api-access-xtmph\") pod \"perf-node-gather-daemonset-f6vff\" (UID: \"15f43225-3c5b-4024-903a-a95877ab329e\") " pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" Apr 21 16:38:02.446564 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.446478 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" Apr 21 16:38:02.578680 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.578646 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff"] Apr 21 16:38:02.582024 ip-10-0-129-96 kubenswrapper[2577]: W0421 16:38:02.581999 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod15f43225_3c5b_4024_903a_a95877ab329e.slice/crio-4106576886e8db1561fd7403bef64c9651aa11f64fbeb52cdd9a0b68d0783ad4 WatchSource:0}: Error finding container 4106576886e8db1561fd7403bef64c9651aa11f64fbeb52cdd9a0b68d0783ad4: Status 404 returned error can't find the container with id 4106576886e8db1561fd7403bef64c9651aa11f64fbeb52cdd9a0b68d0783ad4 Apr 21 16:38:02.583725 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.583707 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 16:38:02.998881 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.998791 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" event={"ID":"15f43225-3c5b-4024-903a-a95877ab329e","Type":"ContainerStarted","Data":"0e093d0b041aa387782f4c5a32e6959f523bac629be208714cd0d4d20b0dcf99"} Apr 21 16:38:02.998881 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.998830 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" event={"ID":"15f43225-3c5b-4024-903a-a95877ab329e","Type":"ContainerStarted","Data":"4106576886e8db1561fd7403bef64c9651aa11f64fbeb52cdd9a0b68d0783ad4"} Apr 21 16:38:02.999092 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:02.998944 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" Apr 21 16:38:03.023593 ip-10-0-129-96 kubenswrapper[2577]: I0421 
16:38:03.023547 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" podStartSLOduration=1.023533142 podStartE2EDuration="1.023533142s" podCreationTimestamp="2026-04-21 16:38:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 16:38:03.022209451 +0000 UTC m=+2179.629111560" watchObservedRunningTime="2026-04-21 16:38:03.023533142 +0000 UTC m=+2179.630435308" Apr 21 16:38:05.352229 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:05.352196 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mp77f_22292ed9-e7f3-49e8-8973-30a2e2fa17a2/dns/0.log" Apr 21 16:38:05.371888 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:05.371865 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-mp77f_22292ed9-e7f3-49e8-8973-30a2e2fa17a2/kube-rbac-proxy/0.log" Apr 21 16:38:05.458861 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:05.458831 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-86kzc_4701d99b-4f05-4410-bf15-f5fd3e3bf5bf/dns-node-resolver/0.log" Apr 21 16:38:05.967485 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:05.967456 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8flks_d74dfe95-7664-4441-8273-cc22ee22d89f/node-ca/0.log" Apr 21 16:38:07.002794 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:07.002747 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-mkphv_4eda884f-b350-476a-b497-386c049422d2/discovery/0.log" Apr 21 16:38:07.034467 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:07.034436 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5598cc66fd-rczdm_45c5d172-f16d-4ebd-9d96-3a8ea68e126c/kube-auth-proxy/0.log" 
Apr 21 16:38:07.643348 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:07.643316 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-bqkbv_89cebd20-5145-4196-8844-826bc1ec6662/serve-healthcheck-canary/0.log" Apr 21 16:38:08.261825 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:08.261750 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vfgtx_6d6f47c5-fbee-4306-a493-e6ac93a55dac/kube-rbac-proxy/0.log" Apr 21 16:38:08.280196 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:08.280160 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vfgtx_6d6f47c5-fbee-4306-a493-e6ac93a55dac/exporter/0.log" Apr 21 16:38:08.299523 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:08.299495 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vfgtx_6d6f47c5-fbee-4306-a493-e6ac93a55dac/extractor/0.log" Apr 21 16:38:09.011700 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:09.011667 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-zttst/perf-node-gather-daemonset-f6vff" Apr 21 16:38:10.255345 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:10.255310 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-k56mb_af849393-15af-4721-953e-a91eb386709d/manager/0.log" Apr 21 16:38:10.375953 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:10.375922 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-bw97v_2423b3fc-41cc-40e4-956e-9f52f63198d3/manager/1.log" Apr 21 16:38:10.396286 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:10.396261 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-bw97v_2423b3fc-41cc-40e4-956e-9f52f63198d3/manager/2.log" Apr 21 
16:38:10.491172 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:10.491139 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-774f54dc87-st7dm_26789298-0cab-4321-846d-66101be60413/manager/0.log" Apr 21 16:38:12.055632 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:12.055602 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-8lss4_af3c6353-f0c5-4a22-80bb-2e0a71253452/openshift-lws-operator/0.log" Apr 21 16:38:17.873897 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:17.873850 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4r2h9_59ea63ef-850a-4d2b-867e-080e5b551a72/kube-multus/0.log" Apr 21 16:38:17.896423 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:17.896396 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8jszj_7640279c-4346-4f46-b222-3c04e5d7569e/kube-multus-additional-cni-plugins/0.log" Apr 21 16:38:17.917318 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:17.917296 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8jszj_7640279c-4346-4f46-b222-3c04e5d7569e/egress-router-binary-copy/0.log" Apr 21 16:38:17.936835 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:17.936806 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8jszj_7640279c-4346-4f46-b222-3c04e5d7569e/cni-plugins/0.log" Apr 21 16:38:17.960375 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:17.960347 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8jszj_7640279c-4346-4f46-b222-3c04e5d7569e/bond-cni-plugin/0.log" Apr 21 16:38:17.984625 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:17.984604 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8jszj_7640279c-4346-4f46-b222-3c04e5d7569e/routeoverride-cni/0.log" Apr 21 16:38:18.019503 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:18.019475 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8jszj_7640279c-4346-4f46-b222-3c04e5d7569e/whereabouts-cni-bincopy/0.log" Apr 21 16:38:18.046943 ip-10-0-129-96 kubenswrapper[2577]: I0421 16:38:18.046876 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8jszj_7640279c-4346-4f46-b222-3c04e5d7569e/whereabouts-cni/0.log"