Apr 24 21:24:53.309228 ip-10-0-130-31 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 21:24:53.309243 ip-10-0-130-31 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 21:24:53.309253 ip-10-0-130-31 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 21:24:53.309595 ip-10-0-130-31 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 21:25:03.331440 ip-10-0-130-31 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 21:25:03.331458 ip-10-0-130-31 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 458e00972b8e4a78a9773dd607b596f1 --
Apr 24 21:27:20.780302 ip-10-0-130-31 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:27:21.258010 ip-10-0-130-31 kubenswrapper[2581]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:21.258010 ip-10-0-130-31 kubenswrapper[2581]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:27:21.258010 ip-10-0-130-31 kubenswrapper[2581]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:21.258010 ip-10-0-130-31 kubenswrapper[2581]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:27:21.258010 ip-10-0-130-31 kubenswrapper[2581]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:21.261768 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.261680 2581 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:27:21.265984 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.265969 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:21.266035 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.265985 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:21.266035 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.265989 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:21.266035 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.265992 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:21.266035 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.265995 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:21.266035 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.265999 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:21.266035 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266002 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:21.266035 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266006 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:21.266035 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266010 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:21.266035 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266013 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:21.266035 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266030 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:21.266035 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266034 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:21.266035 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266037 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:21.266035 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266040 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:21.266352 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266043 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:21.266352 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266046 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:21.266352 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266054 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:21.266352 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266057 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:21.266352 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266060 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:21.266352 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266062 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:21.266352 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266065 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:21.266352 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266067 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:21.266352 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266070 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:21.266352 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266073 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:21.266352 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266077 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:21.266352 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266081 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:21.266352 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266083 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:21.266352 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266086 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:21.266352 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266089 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:21.266352 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266091 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:21.266352 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266094 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:21.266352 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266097 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:21.266352 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266099 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:21.266352 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266103 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:21.266826 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266105 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:21.266826 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266108 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:21.266826 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266111 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:21.266826 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266113 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:21.266826 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266116 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:21.266826 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266118 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:21.266826 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266121 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:21.266826 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266123 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:21.266826 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266126 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:21.266826 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266128 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:21.266826 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266130 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:21.266826 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266133 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:21.266826 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266135 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:21.266826 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266138 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:21.266826 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266141 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:21.266826 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266144 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:21.266826 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266146 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:21.266826 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266149 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:21.266826 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266151 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:21.266826 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266154 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:21.267323 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266156 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:21.267323 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266158 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:21.267323 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266161 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:21.267323 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266163 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:21.267323 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266165 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:21.267323 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266168 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:21.267323 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266171 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:21.267323 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266173 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:21.267323 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266175 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:21.267323 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266178 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:21.267323 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266182 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:21.267323 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266184 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:21.267323 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266186 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:21.267323 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266189 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:21.267323 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266192 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:21.267323 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266194 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:21.267323 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266196 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:21.267323 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266199 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:21.267323 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266202 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:21.267800 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266204 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:21.267800 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266206 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:21.267800 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266209 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:21.267800 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266211 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:21.267800 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266213 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:21.267800 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266216 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:21.267800 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266219 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:21.267800 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266222 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:21.267800 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266225 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:21.267800 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266227 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:21.267800 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266230 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:21.267800 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266232 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:21.267800 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266241 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:21.267800 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266632 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:21.267800 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266637 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:21.267800 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266640 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:21.267800 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266642 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:21.267800 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266645 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:21.267800 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266647 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:21.268354 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266650 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:21.268354 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266653 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:21.268354 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266655 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:21.268354 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266658 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:21.268354 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266660 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:21.268354 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266663 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:21.268354 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266666 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:21.268354 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266668 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:21.268354 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266671 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:21.268354 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266674 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:21.268354 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266676 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:21.268354 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266678 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:21.268354 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266681 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:21.268354 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266684 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:21.268354 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266686 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:21.268354 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266689 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:21.268354 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266691 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:21.268354 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266694 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:21.268354 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266696 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:21.268354 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266700 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:21.268880 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266702 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:21.268880 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266705 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:21.268880 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266708 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:21.268880 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266710 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:21.268880 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266712 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:21.268880 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266715 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:21.268880 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266717 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:21.268880 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266720 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:21.268880 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266722 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:21.268880 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266725 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:21.268880 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266727 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:21.268880 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266729 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:21.268880 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266732 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:21.268880 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266734 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:21.268880 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266737 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:21.268880 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266740 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:21.268880 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266742 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:21.268880 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266745 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:21.268880 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266747 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:21.269378 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266750 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:21.269378 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266752 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:21.269378 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266755 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:21.269378 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266757 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:21.269378 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266760 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:21.269378 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266762 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:21.269378 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266764 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:21.269378 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266767 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:21.269378 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266769 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:21.269378 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266772 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:21.269378 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266774 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:21.269378 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266777 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:21.269378 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266784 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:21.269378 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266787 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:21.269378 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266790 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:21.269378 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266792 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:21.269378 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266794 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:21.269378 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266797 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:21.269378 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266799 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:21.269847 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266802 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:21.269847 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266804 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:21.269847 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266807 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:21.269847 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266809 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:21.269847 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266812 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:21.269847 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266815 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:21.269847 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266817 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:21.269847 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266819 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:21.269847 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266822 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:21.269847 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266824 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:21.269847 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266827 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:21.269847 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266830 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:21.269847 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266833 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:21.269847 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266837 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:21.269847 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266841 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:21.269847 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266846 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:21.269847 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266850 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:21.269847 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266853 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:21.269847 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266856 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266858 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266861 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.266863 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268330 2581 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268340 2581 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268346 2581 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268351 2581 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268356 2581 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268360 2581 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268365 2581 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268370 2581 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268373 2581 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268376 2581 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268379 2581 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268382 2581 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268386 2581 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268388 2581 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268391 2581 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268394 2581 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268397 2581 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268400 2581 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268402 2581 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268407 2581 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:27:21.270326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268410 2581 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268413 2581 flags.go:64] FLAG: --config-dir=""
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268415 2581 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268419 2581 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268423 2581 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268427 2581 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268430 2581 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268433 2581 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268435 2581 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268438 2581 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268441 2581 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268444 2581 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268447 2581 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268451 2581 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268454 2581 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268457 2581 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268460 2581 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268464 2581 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268467 2581 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268472 2581 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268475 2581 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268478 2581 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268481 2581 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268484 2581 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268488 2581 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:27:21.270909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268490 2581 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268493 2581 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268496 2581 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268499 2581 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268502 2581 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268504 2581 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268507 2581 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268510 2581 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268513 2581 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268516 2581 flags.go:64] FLAG: --feature-gates=""
Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268520 2581 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268523 2581 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268526 2581 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268529 2581
flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268533 2581 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268536 2581 flags.go:64] FLAG: --help="false" Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268539 2581 flags.go:64] FLAG: --hostname-override="ip-10-0-130-31.ec2.internal" Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268542 2581 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268545 2581 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268548 2581 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268551 2581 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268554 2581 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268558 2581 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:27:21.271524 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268560 2581 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268563 2581 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268566 2581 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268569 2581 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:27:21.272104 
ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268572 2581 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268575 2581 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268578 2581 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268581 2581 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268584 2581 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268587 2581 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268589 2581 flags.go:64] FLAG: --lock-file="" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268592 2581 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268595 2581 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268598 2581 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268603 2581 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268606 2581 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268608 2581 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268611 2581 flags.go:64] FLAG: --logging-format="text" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268614 2581 flags.go:64] FLAG: 
--machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268617 2581 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268620 2581 flags.go:64] FLAG: --manifest-url="" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268623 2581 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268627 2581 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268631 2581 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268636 2581 flags.go:64] FLAG: --max-pods="110" Apr 24 21:27:21.272104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268639 2581 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268642 2581 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268645 2581 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268648 2581 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268650 2581 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268653 2581 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268657 2581 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268664 2581 flags.go:64] FLAG: --node-status-max-images="50" Apr 
24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268667 2581 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268670 2581 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268673 2581 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268676 2581 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268682 2581 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268685 2581 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268688 2581 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268691 2581 flags.go:64] FLAG: --port="10250" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268694 2581 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268696 2581 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-078487c979bf84e56" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268700 2581 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268702 2581 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268705 2581 flags.go:64] FLAG: --register-node="true" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268708 2581 flags.go:64] FLAG: --register-schedulable="true" Apr 24 21:27:21.272711 ip-10-0-130-31 
kubenswrapper[2581]: I0424 21:27:21.268711 2581 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268714 2581 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:27:21.272711 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268717 2581 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268720 2581 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268722 2581 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268726 2581 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268729 2581 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268732 2581 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268735 2581 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268738 2581 flags.go:64] FLAG: --runonce="false" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268741 2581 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268744 2581 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268747 2581 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268749 2581 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268752 2581 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 21:27:21.273298 
ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268755 2581 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268758 2581 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268763 2581 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268766 2581 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268769 2581 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268772 2581 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268774 2581 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268777 2581 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268781 2581 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268784 2581 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268789 2581 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268792 2581 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:27:21.273298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268795 2581 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268799 2581 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268802 2581 flags.go:64] FLAG: 
--tls-private-key-file="" Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268805 2581 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268807 2581 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268810 2581 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268813 2581 flags.go:64] FLAG: --v="2" Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268817 2581 flags.go:64] FLAG: --version="false" Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268821 2581 flags.go:64] FLAG: --vmodule="" Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268825 2581 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.268828 2581 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268926 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268929 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268933 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268935 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268938 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268941 2581 feature_gate.go:328] unrecognized 
feature gate: BootcNodeManagement Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268944 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268946 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268949 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268951 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268954 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268958 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:21.273935 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268960 2581 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:21.274528 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268963 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:21.274528 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268966 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:21.274528 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268968 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:21.274528 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268971 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:21.274528 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268974 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:21.274528 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268976 2581 feature_gate.go:328] unrecognized feature 
gate: ManagedBootImages Apr 24 21:27:21.274528 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268979 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:21.274528 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268982 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 21:27:21.274528 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268984 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:21.274528 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268987 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:21.274528 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268989 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:21.274528 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268992 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:21.274528 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268994 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:21.274528 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.268997 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:21.274528 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269000 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:21.274528 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269002 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:21.274528 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269005 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:21.274528 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269007 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:21.274528 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269010 2581 
feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 21:27:21.274528 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269012 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:21.275058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269015 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:21.275058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269033 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:21.275058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269036 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:21.275058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269038 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:21.275058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269042 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:21.275058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269044 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:21.275058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269047 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:21.275058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269049 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:21.275058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269053 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 21:27:21.275058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269056 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:21.275058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269060 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:21.275058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269063 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:21.275058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269065 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 21:27:21.275058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269068 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:21.275058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269070 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:21.275058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269073 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:21.275058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269075 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:21.275058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269078 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 21:27:21.275058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269081 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:21.275560 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269083 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:21.275560 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269086 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:21.275560 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269088 2581 
feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 21:27:21.275560 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269091 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:21.275560 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269093 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:21.275560 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269096 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:21.275560 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269101 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 21:27:21.275560 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269103 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:21.275560 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269106 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 21:27:21.275560 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269108 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:27:21.275560 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269111 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:21.275560 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269114 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:27:21.275560 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269118 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:21.275560 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269121 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:21.275560 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269124 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:21.275560 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269126 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:21.275560 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269129 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 21:27:21.275560 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269132 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:21.275560 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269135 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:21.276047 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269138 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:21.276047 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269141 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:21.276047 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269143 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:21.276047 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269146 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:21.276047 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269149 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:21.276047 ip-10-0-130-31 kubenswrapper[2581]: W0424 
21:27:21.269152 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:21.276047 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269154 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:21.276047 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269157 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:21.276047 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269159 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:21.276047 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269161 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:21.276047 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269164 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:21.276047 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269167 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:21.276047 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269169 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:21.276047 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269171 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:21.276047 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.269174 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:21.276429 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.269182 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:27:21.279356 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.279336 2581 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 21:27:21.279395 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.279358 2581 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 21:27:21.279427 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279408 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:21.279427 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279414 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:21.279427 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279417 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:21.279427 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279421 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:21.279427 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279424 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:21.279427 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279426 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:21.279427 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279429 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:21.279603 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279434 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:21.279603 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279455 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:21.279603 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279458 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:21.279603 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279461 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:21.279603 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279464 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:21.279603 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279468 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:21.279603 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279471 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:21.279603 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279474 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:21.279603 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279477 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:21.279603 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279480 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:21.279603 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279482 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:21.279603 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279485 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:21.279603 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279487 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:21.279603 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279490 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:21.279603 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279492 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:21.279603 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279495 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:21.279603 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279497 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:21.279603 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279500 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:21.279603 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279502 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:21.279603 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279505 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:21.280118 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279507 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:21.280118 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279511 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:21.280118 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279513 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:21.280118 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279516 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:21.280118 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279519 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:21.280118 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279522 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:21.280118 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279524 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:21.280118 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279526 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:21.280118 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279529 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:21.280118 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279531 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:21.280118 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279533 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:21.280118 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279536 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:21.280118 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279539 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:21.280118 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279542 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:21.280118 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279544 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:21.280118 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279547 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:21.280118 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279549 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:21.280118 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279552 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:21.280118 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279554 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:21.280593 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279557 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:21.280593 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279559 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:21.280593 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279562 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:21.280593 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279564 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:21.280593 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279567 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:21.280593 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279569 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:21.280593 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279572 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:21.280593 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279574 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:21.280593 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279577 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:21.280593 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279579 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:21.280593 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279581 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:21.280593 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279584 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:21.280593 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279586 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:21.280593 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279589 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:21.280593 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279591 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:21.280593 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279594 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:21.280593 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279596 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:21.280593 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279600 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:21.280593 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279604 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:21.280593 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279606 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:21.281156 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279609 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:21.281156 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279612 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:21.281156 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279615 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:21.281156 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279617 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:21.281156 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279620 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:21.281156 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279622 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:21.281156 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279626 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:21.281156 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279629 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:21.281156 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279632 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:21.281156 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279634 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:21.281156 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279637 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:21.281156 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279640 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:21.281156 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279642 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:21.281156 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279645 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:21.281156 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279648 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:21.281156 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279650 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:21.281156 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279653 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:21.281156 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279655 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:21.281156 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279658 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:21.281156 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279660 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:21.281651 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.279665 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:27:21.281651 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279773 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:21.281651 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279778 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:21.281651 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279781 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:21.281651 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279784 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:21.281651 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279787 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:21.281651 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279790 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:21.281651 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279793 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:21.281651 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279795 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:21.281651 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279798 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:21.281651 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279800 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:21.281651 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279803 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:21.281651 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279806 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:21.281651 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279809 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:21.281651 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279812 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:21.281651 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279814 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:21.282058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279817 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:21.282058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279819 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:21.282058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279823 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:21.282058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279826 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:21.282058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279829 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:21.282058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279832 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:21.282058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279834 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:21.282058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279836 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:21.282058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279839 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:21.282058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279841 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:21.282058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279844 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:21.282058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279846 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:21.282058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279849 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:21.282058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279852 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:21.282058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279855 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:21.282058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279857 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:21.282058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279860 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:21.282058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279862 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:21.282058 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279865 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:21.282521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279867 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:21.282521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279870 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:21.282521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279872 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:21.282521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279875 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:21.282521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279879 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:21.282521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279882 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:21.282521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279885 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:21.282521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279888 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:21.282521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279890 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:21.282521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279893 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:21.282521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279896 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:21.282521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279898 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:21.282521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279901 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:21.282521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279903 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:21.282521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279906 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:21.282521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279908 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:21.282521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279911 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:21.282521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279914 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:21.282521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279916 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:21.282977 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279919 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:21.282977 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279921 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:21.282977 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279924 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:21.282977 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279926 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:21.282977 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279928 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:21.282977 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279931 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:21.282977 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279933 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:21.282977 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279936 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:21.282977 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279938 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:21.282977 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279941 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:21.282977 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279943 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:21.282977 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279945 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:21.282977 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279948 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:21.282977 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279950 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:21.282977 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279953 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:21.282977 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279955 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:21.282977 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279957 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:21.282977 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279960 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:21.282977 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279962 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:21.282977 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279964 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:21.283521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279967 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:21.283521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279969 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:21.283521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279971 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:21.283521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279974 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:21.283521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279977 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:21.283521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279979 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:21.283521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279982 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:21.283521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279984 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:21.283521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279986 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:21.283521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279989 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:21.283521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279992 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:21.283521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279994 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:21.283521 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:21.279997 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:21.283521 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.280002 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:27:21.283521 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.280915 2581 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:27:21.283961 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.283392 2581 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:27:21.284300 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.284289 2581 server.go:1019] "Starting client certificate rotation"
Apr 24 21:27:21.284416 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.284398 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:27:21.284466 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.284450 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:27:21.314015 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.313992 2581 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:27:21.321650 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.321626 2581 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:27:21.336959 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.336936 2581 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:27:21.344072 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.344052 2581 log.go:25] "Validated CRI v1 image API"
Apr 24 21:27:21.345539 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.345525 2581 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:27:21.345979 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.345965 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:27:21.351919 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.351895 2581 fs.go:135] Filesystem UUIDs: map[43ef5758-2c4a-47ff-b3b5-4327095b681a:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 f3eb8fdd-14e9-425a-a477-8f1a56474553:/dev/nvme0n1p3]
Apr 24 21:27:21.352008 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.351918 2581 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:27:21.358765 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.358644 2581 manager.go:217] Machine: {Timestamp:2026-04-24 21:27:21.356313752 +0000 UTC m=+0.465407227 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100105 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2afbcc2187ddbb99cfe520708563f4 SystemUUID:ec2afbcc-2187-ddbb-99cf-e520708563f4 BootID:458e0097-2b8e-4a78-a977-3dd607b596f1 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:d0:86:70:27:79 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:d0:86:70:27:79 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:86:86:cf:e1:c6:0f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:27:21.358765 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.358753 2581 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:27:21.358938 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.358886 2581 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:27:21.360155 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.360131 2581 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:27:21.360334 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.360159 2581 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-31.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 21:27:21.360420 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.360350 2581 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 21:27:21.360420 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.360363 2581 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 21:27:21.360420 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.360382 2581 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:27:21.360420 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.360408 2581 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:27:21.362395 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.362381 2581 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:27:21.362540 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.362528 2581 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:27:21.365373 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.365361 2581 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:27:21.365439 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.365385 2581 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:27:21.365439 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.365402 2581 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 21:27:21.365439 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.365415 2581 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:27:21.365608 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.365451 2581 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 24 21:27:21.366667 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.366651 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:27:21.366731 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.366699 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:27:21.370707 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.370689 2581 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:27:21.372760 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.372744 2581 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:27:21.373305 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.373290 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2vlq8" Apr 24 21:27:21.374861 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.374849 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:27:21.374899 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.374867 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:27:21.374899 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.374873 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:27:21.374899 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.374879 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:27:21.374899 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.374885 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:27:21.374899 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.374890 2581 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:27:21.374899 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.374896 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 21:27:21.374899 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.374901 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:27:21.375100 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.374909 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:27:21.375100 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.374915 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:27:21.375100 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.374923 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 21:27:21.375100 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.374932 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:27:21.375954 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.375941 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:27:21.375986 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.375958 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:27:21.377854 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:21.377827 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:27:21.378054 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:21.378001 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-31.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the 
cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:27:21.380236 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.380223 2581 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:27:21.380285 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.380258 2581 server.go:1295] "Started kubelet" Apr 24 21:27:21.380365 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.380332 2581 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:27:21.380471 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.380342 2581 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:27:21.380471 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.380398 2581 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:27:21.381121 ip-10-0-130-31 systemd[1]: Started Kubernetes Kubelet. Apr 24 21:27:21.381955 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.381721 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2vlq8" Apr 24 21:27:21.381955 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.381752 2581 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:27:21.383244 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.383226 2581 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:27:21.390000 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.389981 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:27:21.390658 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.390638 2581 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:27:21.391388 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.391369 2581 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:27:21.391388 
ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.391391 2581 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:27:21.391532 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.391426 2581 factory.go:55] Registering systemd factory Apr 24 21:27:21.391532 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.391487 2581 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:27:21.391532 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.391489 2581 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:27:21.391656 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.391545 2581 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:27:21.391656 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.391554 2581 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:27:21.391758 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:21.391652 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-31.ec2.internal\" not found" Apr 24 21:27:21.392004 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.391987 2581 factory.go:153] Registering CRI-O factory Apr 24 21:27:21.392089 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.392007 2581 factory.go:223] Registration of the crio container factory successfully Apr 24 21:27:21.392089 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.392079 2581 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:27:21.392189 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.392108 2581 factory.go:103] Registering Raw factory Apr 24 21:27:21.392189 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.392124 2581 manager.go:1196] Started watching for new ooms in manager Apr 24 21:27:21.392332 ip-10-0-130-31 
kubenswrapper[2581]: E0424 21:27:21.392284 2581 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:27:21.392641 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.392621 2581 manager.go:319] Starting recovery of all containers Apr 24 21:27:21.392775 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.392754 2581 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-31.ec2.internal" not found Apr 24 21:27:21.392896 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.392882 2581 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:21.395359 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:21.395339 2581 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-31.ec2.internal\" not found" node="ip-10-0-130-31.ec2.internal" Apr 24 21:27:21.404169 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.404145 2581 manager.go:324] Recovery completed Apr 24 21:27:21.405391 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:21.405366 2581 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 24 21:27:21.408294 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.408281 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:21.410358 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.410344 2581 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-31.ec2.internal" not found Apr 24 21:27:21.411054 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.411037 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-31.ec2.internal" 
event="NodeHasSufficientMemory" Apr 24 21:27:21.411129 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.411064 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-31.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:21.411129 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.411076 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-31.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:21.411597 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.411584 2581 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:27:21.411659 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.411598 2581 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:27:21.411659 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.411618 2581 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:27:21.414656 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.414645 2581 policy_none.go:49] "None policy: Start" Apr 24 21:27:21.414699 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.414669 2581 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:27:21.414699 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.414678 2581 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:27:21.455663 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.455646 2581 manager.go:341] "Starting Device Plugin manager" Apr 24 21:27:21.458687 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:21.455755 2581 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:27:21.458687 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.455772 2581 server.go:85] "Starting device plugin registration server" Apr 24 21:27:21.458687 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.456008 2581 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:27:21.458687 ip-10-0-130-31 
kubenswrapper[2581]: I0424 21:27:21.456046 2581 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:27:21.458687 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.456138 2581 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:27:21.458687 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.456200 2581 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:27:21.458687 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.456210 2581 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:27:21.458687 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:21.456700 2581 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 21:27:21.458687 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:21.456736 2581 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-31.ec2.internal\" not found" Apr 24 21:27:21.468745 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.468728 2581 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-31.ec2.internal" not found Apr 24 21:27:21.516865 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.516781 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:27:21.518151 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.518134 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 21:27:21.518212 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.518167 2581 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:27:21.518212 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.518191 2581 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 21:27:21.518212 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.518200 2581 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:27:21.518322 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:21.518242 2581 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:27:21.523660 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.523639 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:21.556453 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.556432 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:21.557551 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.557534 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-31.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:21.557645 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.557563 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-31.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:21.557645 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.557574 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-31.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:21.557645 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.557597 2581 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-31.ec2.internal" Apr 24 21:27:21.573246 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.573221 2581 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-31.ec2.internal" Apr 24 21:27:21.573246 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:21.573247 2581 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-31.ec2.internal\": node \"ip-10-0-130-31.ec2.internal\" not found" Apr 24 21:27:21.593083 
ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:21.593063 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-31.ec2.internal\" not found" Apr 24 21:27:21.619164 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.619127 2581 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-31.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-31.ec2.internal"] Apr 24 21:27:21.619245 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.619210 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:21.620107 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.620092 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-31.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:21.620179 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.620120 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-31.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:21.620179 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.620130 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-31.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:21.621214 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.621201 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:21.621356 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.621342 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-31.ec2.internal" Apr 24 21:27:21.621396 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.621376 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:21.622099 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.622083 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-31.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:21.622147 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.622114 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-31.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:21.622147 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.622123 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-31.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:21.622212 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.622083 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-31.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:21.622212 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.622185 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-31.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:21.622212 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.622199 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-31.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:21.623013 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.622998 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-31.ec2.internal" Apr 24 21:27:21.623112 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.623035 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:21.623700 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.623675 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-31.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:21.623776 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.623712 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-31.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:21.623776 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.623721 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-31.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:21.652827 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:21.652806 2581 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-31.ec2.internal\" not found" node="ip-10-0-130-31.ec2.internal" Apr 24 21:27:21.657632 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:21.657615 2581 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-31.ec2.internal\" not found" node="ip-10-0-130-31.ec2.internal" Apr 24 21:27:21.692969 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.692945 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/72565ecde5f90cfca62d0623671f5130-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-31.ec2.internal\" (UID: \"72565ecde5f90cfca62d0623671f5130\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-31.ec2.internal" Apr 24 21:27:21.693088 ip-10-0-130-31 kubenswrapper[2581]: I0424 
21:27:21.692975 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72565ecde5f90cfca62d0623671f5130-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-31.ec2.internal\" (UID: \"72565ecde5f90cfca62d0623671f5130\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-31.ec2.internal" Apr 24 21:27:21.693088 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.692999 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/245bce5339b00b6e9cfc0086658d8fb7-config\") pod \"kube-apiserver-proxy-ip-10-0-130-31.ec2.internal\" (UID: \"245bce5339b00b6e9cfc0086658d8fb7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-31.ec2.internal" Apr 24 21:27:21.693952 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:21.693934 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-31.ec2.internal\" not found" Apr 24 21:27:21.793183 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.793111 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/72565ecde5f90cfca62d0623671f5130-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-31.ec2.internal\" (UID: \"72565ecde5f90cfca62d0623671f5130\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-31.ec2.internal" Apr 24 21:27:21.793183 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.793143 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72565ecde5f90cfca62d0623671f5130-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-31.ec2.internal\" (UID: \"72565ecde5f90cfca62d0623671f5130\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-31.ec2.internal" Apr 24 
21:27:21.793183 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.793159 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/245bce5339b00b6e9cfc0086658d8fb7-config\") pod \"kube-apiserver-proxy-ip-10-0-130-31.ec2.internal\" (UID: \"245bce5339b00b6e9cfc0086658d8fb7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-31.ec2.internal" Apr 24 21:27:21.793352 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.793193 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72565ecde5f90cfca62d0623671f5130-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-31.ec2.internal\" (UID: \"72565ecde5f90cfca62d0623671f5130\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-31.ec2.internal" Apr 24 21:27:21.793352 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.793207 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/72565ecde5f90cfca62d0623671f5130-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-31.ec2.internal\" (UID: \"72565ecde5f90cfca62d0623671f5130\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-31.ec2.internal" Apr 24 21:27:21.793352 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.793224 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/245bce5339b00b6e9cfc0086658d8fb7-config\") pod \"kube-apiserver-proxy-ip-10-0-130-31.ec2.internal\" (UID: \"245bce5339b00b6e9cfc0086658d8fb7\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-31.ec2.internal" Apr 24 21:27:21.794137 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:21.794114 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-31.ec2.internal\" not found" Apr 24 21:27:21.894666 ip-10-0-130-31 
kubenswrapper[2581]: E0424 21:27:21.894633 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-31.ec2.internal\" not found" Apr 24 21:27:21.954840 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.954809 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-31.ec2.internal" Apr 24 21:27:21.960309 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:21.960294 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-31.ec2.internal" Apr 24 21:27:21.995663 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:21.995618 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-31.ec2.internal\" not found" Apr 24 21:27:22.096126 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:22.096058 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-31.ec2.internal\" not found" Apr 24 21:27:22.196617 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:22.196581 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-31.ec2.internal\" not found" Apr 24 21:27:22.283760 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:22.283724 2581 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 21:27:22.284378 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:22.283856 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:27:22.284378 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:22.283892 2581 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:27:22.297089 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:22.297067 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-31.ec2.internal\" not found" Apr 24 21:27:22.365418 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:22.365391 2581 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:22.384567 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:22.384523 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:22:21 +0000 UTC" deadline="2027-09-28 10:13:06.769205694 +0000 UTC" Apr 24 21:27:22.384567 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:22.384564 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12516h45m44.384645515s" Apr 24 21:27:22.390694 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:22.390672 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 21:27:22.398675 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:22.398653 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-31.ec2.internal\" not found" Apr 24 21:27:22.402793 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:22.402774 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:27:22.426142 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:22.426121 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" 
logger="kubernetes.io/kubelet-serving" csr="csr-x7wcv" Apr 24 21:27:22.432159 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:22.432143 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-x7wcv" Apr 24 21:27:22.464530 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:22.464491 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod245bce5339b00b6e9cfc0086658d8fb7.slice/crio-14bd32aaf670df4e70848d455d8830fb9b388d34245d227ad712c166c25d091f WatchSource:0}: Error finding container 14bd32aaf670df4e70848d455d8830fb9b388d34245d227ad712c166c25d091f: Status 404 returned error can't find the container with id 14bd32aaf670df4e70848d455d8830fb9b388d34245d227ad712c166c25d091f Apr 24 21:27:22.471048 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:22.470961 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:27:22.476845 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:22.476822 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72565ecde5f90cfca62d0623671f5130.slice/crio-c5a20ff023a52eeb00f30ca42a27ce5c3607d27cfa06e9e419c10ca5be90afe7 WatchSource:0}: Error finding container c5a20ff023a52eeb00f30ca42a27ce5c3607d27cfa06e9e419c10ca5be90afe7: Status 404 returned error can't find the container with id c5a20ff023a52eeb00f30ca42a27ce5c3607d27cfa06e9e419c10ca5be90afe7 Apr 24 21:27:22.499488 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:22.499466 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-31.ec2.internal\" not found" Apr 24 21:27:22.521335 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:22.521285 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-31.ec2.internal" 
event={"ID":"72565ecde5f90cfca62d0623671f5130","Type":"ContainerStarted","Data":"c5a20ff023a52eeb00f30ca42a27ce5c3607d27cfa06e9e419c10ca5be90afe7"} Apr 24 21:27:22.522268 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:22.522249 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-31.ec2.internal" event={"ID":"245bce5339b00b6e9cfc0086658d8fb7","Type":"ContainerStarted","Data":"14bd32aaf670df4e70848d455d8830fb9b388d34245d227ad712c166c25d091f"} Apr 24 21:27:22.599827 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:22.599798 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-31.ec2.internal\" not found" Apr 24 21:27:22.700374 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:22.700272 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-31.ec2.internal\" not found" Apr 24 21:27:22.800862 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:22.800835 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-31.ec2.internal\" not found" Apr 24 21:27:22.825776 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:22.825751 2581 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:22.891771 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:22.891732 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-31.ec2.internal" Apr 24 21:27:22.903207 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:22.903089 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:27:22.903957 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:22.903928 2581 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-130-31.ec2.internal" Apr 24 21:27:22.912929 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:22.912837 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:27:23.148390 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.148304 2581 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:23.190618 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.190587 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:23.366095 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.366058 2581 apiserver.go:52] "Watching apiserver" Apr 24 21:27:23.372747 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.372722 2581 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 21:27:23.374585 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.374561 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-h4z9c","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw","openshift-dns/node-resolver-hjwlf","openshift-image-registry/node-ca-kplsb","openshift-multus/multus-2ptsg","openshift-network-operator/iptables-alerter-45fvz","openshift-ovn-kubernetes/ovnkube-node-8j4mf","kube-system/kube-apiserver-proxy-ip-10-0-130-31.ec2.internal","openshift-cluster-node-tuning-operator/tuned-lnqjv","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-31.ec2.internal","openshift-multus/multus-additional-cni-plugins-ps7q2","openshift-multus/network-metrics-daemon-489tz","openshift-network-diagnostics/network-check-target-2f65f"] Apr 24 21:27:23.377561 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.377537 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.379881 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.379805 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" Apr 24 21:27:23.379987 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.379901 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 21:27:23.379987 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.379934 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 21:27:23.380119 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.380104 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 21:27:23.380609 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.380592 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 21:27:23.380715 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.380594 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 21:27:23.382036 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.381862 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 21:27:23.383320 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.383153 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:27:23.383320 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.383210 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:27:23.383320 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.383223 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:27:23.383320 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.383257 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-j52cm\"" Apr 24 21:27:23.383320 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.383291 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-74bs9\"" Apr 24 21:27:23.385040 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.384223 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hjwlf" Apr 24 21:27:23.385040 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.384338 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kplsb" Apr 24 21:27:23.386408 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.386392 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8knfc\"" Apr 24 21:27:23.386496 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.386448 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:27:23.386599 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.386572 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 21:27:23.386726 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.386629 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:27:23.386726 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.386637 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-gm5vw\"" Apr 24 21:27:23.386726 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.386398 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 21:27:23.386915 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.386852 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:27:23.389530 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.389153 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.389530 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.389260 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-45fvz" Apr 24 21:27:23.397319 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.397299 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 21:27:23.397520 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.397504 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 21:27:23.397945 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.397924 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:23.398195 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.398171 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vmqdm\"" Apr 24 21:27:23.398365 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.398347 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 21:27:23.398453 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.398409 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 21:27:23.398529 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.398514 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 21:27:23.398529 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.398410 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 21:27:23.398633 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.398521 2581 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-44d8l\"" Apr 24 21:27:23.399597 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.399571 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.401525 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.401507 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:23.401695 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.401675 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-m2s9z\"" Apr 24 21:27:23.401766 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.401701 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 21:27:23.401951 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.401924 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-etc-sysctl-d\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.402041 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.401953 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m724r\" (UniqueName: \"kubernetes.io/projected/84da9595-3baf-4bee-854d-b2858b093de3-kube-api-access-m724r\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.402041 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.401980 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db3d63f4-067a-47a5-b441-e08cbb119ecd-ovnkube-script-lib\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.402041 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402004 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kql94\" (UniqueName: \"kubernetes.io/projected/db3d63f4-067a-47a5-b441-e08cbb119ecd-kube-api-access-kql94\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.402292 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402047 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49426\" (UniqueName: \"kubernetes.io/projected/4b00449f-18b5-4507-83ec-4a003e10f7fb-kube-api-access-49426\") pod \"node-ca-kplsb\" (UID: \"4b00449f-18b5-4507-83ec-4a003e10f7fb\") " pod="openshift-image-registry/node-ca-kplsb" Apr 24 21:27:23.402292 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402071 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e844355b-199a-4f53-993b-84603868363e-registration-dir\") pod \"aws-ebs-csi-driver-node-x5lzw\" (UID: \"e844355b-199a-4f53-993b-84603868363e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" Apr 24 21:27:23.402292 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402112 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-host-var-lib-cni-bin\") pod \"multus-2ptsg\" (UID: 
\"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.402292 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402139 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-hostroot\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.402292 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402166 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/61331cbf-bfdf-44cd-895b-21d09c03e3a3-agent-certs\") pod \"konnectivity-agent-h4z9c\" (UID: \"61331cbf-bfdf-44cd-895b-21d09c03e3a3\") " pod="kube-system/konnectivity-agent-h4z9c" Apr 24 21:27:23.402292 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402189 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db3d63f4-067a-47a5-b441-e08cbb119ecd-ovnkube-config\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.402292 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402213 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e844355b-199a-4f53-993b-84603868363e-etc-selinux\") pod \"aws-ebs-csi-driver-node-x5lzw\" (UID: \"e844355b-199a-4f53-993b-84603868363e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" Apr 24 21:27:23.402292 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402239 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-host-run-k8s-cni-cncf-io\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.402292 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402262 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-host-run-netns\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.402292 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402285 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-host-run-netns\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.402749 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402310 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e844355b-199a-4f53-993b-84603868363e-device-dir\") pod \"aws-ebs-csi-driver-node-x5lzw\" (UID: \"e844355b-199a-4f53-993b-84603868363e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" Apr 24 21:27:23.402749 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402335 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-multus-cni-dir\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.402749 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402357 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cb4c0791-332c-4626-884a-8947b04761c9-multus-daemon-config\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.402749 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402391 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-host-cni-netd\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.402749 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402416 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.402749 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402442 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzzxv\" (UniqueName: \"kubernetes.io/projected/cb4c0791-332c-4626-884a-8947b04761c9-kube-api-access-hzzxv\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.402749 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402515 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-var-lib-kubelet\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " 
pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.402749 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402540 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-systemd-units\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.402749 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402563 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e844355b-199a-4f53-993b-84603868363e-socket-dir\") pod \"aws-ebs-csi-driver-node-x5lzw\" (UID: \"e844355b-199a-4f53-993b-84603868363e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" Apr 24 21:27:23.402749 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402608 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-multus-conf-dir\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.402749 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402640 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/61331cbf-bfdf-44cd-895b-21d09c03e3a3-konnectivity-ca\") pod \"konnectivity-agent-h4z9c\" (UID: \"61331cbf-bfdf-44cd-895b-21d09c03e3a3\") " pod="kube-system/konnectivity-agent-h4z9c" Apr 24 21:27:23.402749 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402659 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-etc-modprobe-d\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.403307 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402821 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-run-systemd\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.403307 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402883 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-node-log\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.403307 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402910 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ad14c53e-e5b6-4cbb-9e60-af19eb6027a6-iptables-alerter-script\") pod \"iptables-alerter-45fvz\" (UID: \"ad14c53e-e5b6-4cbb-9e60-af19eb6027a6\") " pod="openshift-network-operator/iptables-alerter-45fvz" Apr 24 21:27:23.403307 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402926 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-h4z9c" Apr 24 21:27:23.403307 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.402974 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e844355b-199a-4f53-993b-84603868363e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-x5lzw\" (UID: \"e844355b-199a-4f53-993b-84603868363e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" Apr 24 21:27:23.403307 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403005 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-os-release\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.403307 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403056 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cb4c0791-332c-4626-884a-8947b04761c9-cni-binary-copy\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.403307 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403106 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-host-var-lib-kubelet\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.403307 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403143 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-host-run-multus-certs\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.403307 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403191 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-etc-kubernetes\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.403307 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403216 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/964dbac9-11de-44a8-b2ea-152ca4914413-hosts-file\") pod \"node-resolver-hjwlf\" (UID: \"964dbac9-11de-44a8-b2ea-152ca4914413\") " pod="openshift-dns/node-resolver-hjwlf" Apr 24 21:27:23.403307 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403242 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-system-cni-dir\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.403307 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403282 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-etc-sysconfig\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.403861 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403321 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-host\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.403861 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403347 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db3d63f4-067a-47a5-b441-e08cbb119ecd-env-overrides\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.403861 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403380 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ad14c53e-e5b6-4cbb-9e60-af19eb6027a6-host-slash\") pod \"iptables-alerter-45fvz\" (UID: \"ad14c53e-e5b6-4cbb-9e60-af19eb6027a6\") " pod="openshift-network-operator/iptables-alerter-45fvz" Apr 24 21:27:23.403861 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403407 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-run\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.403861 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403425 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-lib-modules\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.403861 ip-10-0-130-31 
kubenswrapper[2581]: I0424 21:27:23.403439 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-etc-openvswitch\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.403861 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403461 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nngrj\" (UniqueName: \"kubernetes.io/projected/e844355b-199a-4f53-993b-84603868363e-kube-api-access-nngrj\") pod \"aws-ebs-csi-driver-node-x5lzw\" (UID: \"e844355b-199a-4f53-993b-84603868363e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" Apr 24 21:27:23.403861 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403481 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-etc-sysctl-conf\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.403861 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403498 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-etc-systemd\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.403861 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403520 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-run-openvswitch\") pod 
\"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.403861 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403553 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/964dbac9-11de-44a8-b2ea-152ca4914413-tmp-dir\") pod \"node-resolver-hjwlf\" (UID: \"964dbac9-11de-44a8-b2ea-152ca4914413\") " pod="openshift-dns/node-resolver-hjwlf" Apr 24 21:27:23.403861 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403578 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/84da9595-3baf-4bee-854d-b2858b093de3-etc-tuned\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.403861 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403610 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-host-slash\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.403861 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403637 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-log-socket\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.403861 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403662 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-var-lib-openvswitch\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.403861 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403692 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-run-ovn\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.403861 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403718 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-multus-socket-dir-parent\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.404670 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403742 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-sys\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.404670 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403763 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84da9595-3baf-4bee-854d-b2858b093de3-tmp\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.404670 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403785 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-host-kubelet\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.404670 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403803 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7rcw\" (UniqueName: \"kubernetes.io/projected/964dbac9-11de-44a8-b2ea-152ca4914413-kube-api-access-l7rcw\") pod \"node-resolver-hjwlf\" (UID: \"964dbac9-11de-44a8-b2ea-152ca4914413\") " pod="openshift-dns/node-resolver-hjwlf" Apr 24 21:27:23.404670 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403830 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4b00449f-18b5-4507-83ec-4a003e10f7fb-serviceca\") pod \"node-ca-kplsb\" (UID: \"4b00449f-18b5-4507-83ec-4a003e10f7fb\") " pod="openshift-image-registry/node-ca-kplsb" Apr 24 21:27:23.404670 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403854 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db3d63f4-067a-47a5-b441-e08cbb119ecd-ovn-node-metrics-cert\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.404670 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403887 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b00449f-18b5-4507-83ec-4a003e10f7fb-host\") pod \"node-ca-kplsb\" (UID: \"4b00449f-18b5-4507-83ec-4a003e10f7fb\") " pod="openshift-image-registry/node-ca-kplsb" Apr 24 21:27:23.404670 ip-10-0-130-31 
kubenswrapper[2581]: I0424 21:27:23.403909 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-host-var-lib-cni-multus\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.404670 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403930 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-etc-kubernetes\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.404670 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403954 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-host-run-ovn-kubernetes\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.404670 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.403976 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-host-cni-bin\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.404670 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.404000 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knz86\" (UniqueName: \"kubernetes.io/projected/ad14c53e-e5b6-4cbb-9e60-af19eb6027a6-kube-api-access-knz86\") pod \"iptables-alerter-45fvz\" (UID: 
\"ad14c53e-e5b6-4cbb-9e60-af19eb6027a6\") " pod="openshift-network-operator/iptables-alerter-45fvz" Apr 24 21:27:23.404670 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.404044 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e844355b-199a-4f53-993b-84603868363e-sys-fs\") pod \"aws-ebs-csi-driver-node-x5lzw\" (UID: \"e844355b-199a-4f53-993b-84603868363e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" Apr 24 21:27:23.404670 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.404066 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-cnibin\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.405489 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.405064 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-6c2sv\"" Apr 24 21:27:23.405489 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.405233 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 21:27:23.405489 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.405360 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 21:27:23.405489 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.405486 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ps7q2" Apr 24 21:27:23.407610 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.407580 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 21:27:23.407709 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.407681 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-489tz" Apr 24 21:27:23.407709 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.407696 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2f65f" Apr 24 21:27:23.407816 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:23.407752 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2f65f" podUID="acca6a48-f7ff-4dec-82df-945011bc308d" Apr 24 21:27:23.408126 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.408107 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 21:27:23.408206 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.408113 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-l82zb\"" Apr 24 21:27:23.408423 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:23.407754 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-489tz" podUID="5dfd7cf1-e10a-410e-b412-be269391a904" Apr 24 21:27:23.432730 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.432703 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:22 +0000 UTC" deadline="2027-10-30 12:40:00.300829921 +0000 UTC" Apr 24 21:27:23.432730 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.432728 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13287h12m36.868104167s" Apr 24 21:27:23.493412 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.493390 2581 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 21:27:23.504724 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.504696 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-host\") pod \"tuned-lnqjv\" 
(UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.504872 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.504728 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db3d63f4-067a-47a5-b441-e08cbb119ecd-env-overrides\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.504872 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.504767 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ad14c53e-e5b6-4cbb-9e60-af19eb6027a6-host-slash\") pod \"iptables-alerter-45fvz\" (UID: \"ad14c53e-e5b6-4cbb-9e60-af19eb6027a6\") " pod="openshift-network-operator/iptables-alerter-45fvz" Apr 24 21:27:23.504872 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.504777 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-host\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.504872 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.504788 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-run\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.504872 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.504835 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-lib-modules\") pod \"tuned-lnqjv\" (UID: 
\"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.504872 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.504853 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ad14c53e-e5b6-4cbb-9e60-af19eb6027a6-host-slash\") pod \"iptables-alerter-45fvz\" (UID: \"ad14c53e-e5b6-4cbb-9e60-af19eb6027a6\") " pod="openshift-network-operator/iptables-alerter-45fvz" Apr 24 21:27:23.505154 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.504857 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-etc-openvswitch\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.505154 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.504891 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-etc-openvswitch\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.505154 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.504912 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nngrj\" (UniqueName: \"kubernetes.io/projected/e844355b-199a-4f53-993b-84603868363e-kube-api-access-nngrj\") pod \"aws-ebs-csi-driver-node-x5lzw\" (UID: \"e844355b-199a-4f53-993b-84603868363e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" Apr 24 21:27:23.505154 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.504923 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-run\") pod 
\"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.505154 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.504941 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-etc-sysctl-conf\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.505154 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.504969 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-etc-systemd\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.505154 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.504990 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-lib-modules\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.505154 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.504996 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-run-openvswitch\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.505154 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505050 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-run-openvswitch\") pod 
\"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.505154 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505058 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/964dbac9-11de-44a8-b2ea-152ca4914413-tmp-dir\") pod \"node-resolver-hjwlf\" (UID: \"964dbac9-11de-44a8-b2ea-152ca4914413\") " pod="openshift-dns/node-resolver-hjwlf" Apr 24 21:27:23.505154 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505095 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs\") pod \"network-metrics-daemon-489tz\" (UID: \"5dfd7cf1-e10a-410e-b412-be269391a904\") " pod="openshift-multus/network-metrics-daemon-489tz" Apr 24 21:27:23.505154 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505113 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-etc-systemd\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.505154 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505124 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-etc-sysctl-conf\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.505154 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505125 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/84da9595-3baf-4bee-854d-b2858b093de3-etc-tuned\") pod \"tuned-lnqjv\" 
(UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.505769 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505182 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-host-slash\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.505769 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505212 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-log-socket\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.505769 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505237 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-var-lib-openvswitch\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.505769 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505264 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-run-ovn\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.505769 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505289 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-multus-socket-dir-parent\") pod 
\"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.505769 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505314 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-sys\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.505769 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505315 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/db3d63f4-067a-47a5-b441-e08cbb119ecd-env-overrides\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.505769 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505339 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/84da9595-3baf-4bee-854d-b2858b093de3-tmp\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.505769 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505365 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-host-kubelet\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.505769 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505331 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-log-socket\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.505769 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505399 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/964dbac9-11de-44a8-b2ea-152ca4914413-tmp-dir\") pod \"node-resolver-hjwlf\" (UID: \"964dbac9-11de-44a8-b2ea-152ca4914413\") " pod="openshift-dns/node-resolver-hjwlf" Apr 24 21:27:23.505769 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505411 2581 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 21:27:23.505769 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505458 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-host-kubelet\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.505769 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505401 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7rcw\" (UniqueName: \"kubernetes.io/projected/964dbac9-11de-44a8-b2ea-152ca4914413-kube-api-access-l7rcw\") pod \"node-resolver-hjwlf\" (UID: \"964dbac9-11de-44a8-b2ea-152ca4914413\") " pod="openshift-dns/node-resolver-hjwlf" Apr 24 21:27:23.505769 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505480 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-multus-socket-dir-parent\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.505769 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505501 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4b00449f-18b5-4507-83ec-4a003e10f7fb-serviceca\") pod \"node-ca-kplsb\" (UID: \"4b00449f-18b5-4507-83ec-4a003e10f7fb\") " pod="openshift-image-registry/node-ca-kplsb" Apr 24 21:27:23.505769 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505515 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-sys\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.505769 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505548 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db3d63f4-067a-47a5-b441-e08cbb119ecd-ovn-node-metrics-cert\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.506632 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505554 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-run-ovn\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.506632 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505471 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-var-lib-openvswitch\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.506632 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505367 2581 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-host-slash\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.506632 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505605 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b00449f-18b5-4507-83ec-4a003e10f7fb-host\") pod \"node-ca-kplsb\" (UID: \"4b00449f-18b5-4507-83ec-4a003e10f7fb\") " pod="openshift-image-registry/node-ca-kplsb" Apr 24 21:27:23.506632 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505683 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f4586cbe-e12f-4084-9d26-5a60d4858635-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2" Apr 24 21:27:23.506632 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505730 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f4586cbe-e12f-4084-9d26-5a60d4858635-system-cni-dir\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2" Apr 24 21:27:23.506632 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505748 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b00449f-18b5-4507-83ec-4a003e10f7fb-host\") pod \"node-ca-kplsb\" (UID: \"4b00449f-18b5-4507-83ec-4a003e10f7fb\") " pod="openshift-image-registry/node-ca-kplsb" Apr 24 21:27:23.506632 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505810 
2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjkr6\" (UniqueName: \"kubernetes.io/projected/5dfd7cf1-e10a-410e-b412-be269391a904-kube-api-access-mjkr6\") pod \"network-metrics-daemon-489tz\" (UID: \"5dfd7cf1-e10a-410e-b412-be269391a904\") " pod="openshift-multus/network-metrics-daemon-489tz" Apr 24 21:27:23.506632 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505842 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-host-var-lib-cni-multus\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.506632 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505870 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f4586cbe-e12f-4084-9d26-5a60d4858635-os-release\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2" Apr 24 21:27:23.506632 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505878 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4b00449f-18b5-4507-83ec-4a003e10f7fb-serviceca\") pod \"node-ca-kplsb\" (UID: \"4b00449f-18b5-4507-83ec-4a003e10f7fb\") " pod="openshift-image-registry/node-ca-kplsb" Apr 24 21:27:23.506632 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505899 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-etc-kubernetes\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.506632 ip-10-0-130-31 
kubenswrapper[2581]: I0424 21:27:23.505904 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-host-var-lib-cni-multus\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.506632 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505930 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-host-run-ovn-kubernetes\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.506632 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505959 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-etc-kubernetes\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.506632 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.505964 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-host-cni-bin\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.506632 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506004 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-knz86\" (UniqueName: \"kubernetes.io/projected/ad14c53e-e5b6-4cbb-9e60-af19eb6027a6-kube-api-access-knz86\") pod \"iptables-alerter-45fvz\" (UID: \"ad14c53e-e5b6-4cbb-9e60-af19eb6027a6\") " pod="openshift-network-operator/iptables-alerter-45fvz" Apr 24 21:27:23.507341 
ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506060 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e844355b-199a-4f53-993b-84603868363e-sys-fs\") pod \"aws-ebs-csi-driver-node-x5lzw\" (UID: \"e844355b-199a-4f53-993b-84603868363e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" Apr 24 21:27:23.507341 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506008 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-host-cni-bin\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.507341 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506090 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-cnibin\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.507341 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506117 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-etc-sysctl-d\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.507341 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506142 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e844355b-199a-4f53-993b-84603868363e-sys-fs\") pod \"aws-ebs-csi-driver-node-x5lzw\" (UID: \"e844355b-199a-4f53-993b-84603868363e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" Apr 24 21:27:23.507341 ip-10-0-130-31 
kubenswrapper[2581]: I0424 21:27:23.506148 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m724r\" (UniqueName: \"kubernetes.io/projected/84da9595-3baf-4bee-854d-b2858b093de3-kube-api-access-m724r\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.507341 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506178 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db3d63f4-067a-47a5-b441-e08cbb119ecd-ovnkube-script-lib\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.507341 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506205 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kql94\" (UniqueName: \"kubernetes.io/projected/db3d63f4-067a-47a5-b441-e08cbb119ecd-kube-api-access-kql94\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.507341 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506205 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-cnibin\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.507341 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506062 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-host-run-ovn-kubernetes\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.507341 
ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506229 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49426\" (UniqueName: \"kubernetes.io/projected/4b00449f-18b5-4507-83ec-4a003e10f7fb-kube-api-access-49426\") pod \"node-ca-kplsb\" (UID: \"4b00449f-18b5-4507-83ec-4a003e10f7fb\") " pod="openshift-image-registry/node-ca-kplsb" Apr 24 21:27:23.507341 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506253 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e844355b-199a-4f53-993b-84603868363e-registration-dir\") pod \"aws-ebs-csi-driver-node-x5lzw\" (UID: \"e844355b-199a-4f53-993b-84603868363e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" Apr 24 21:27:23.507341 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506286 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-host-var-lib-cni-bin\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.507341 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506472 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-hostroot\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.507341 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506510 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/61331cbf-bfdf-44cd-895b-21d09c03e3a3-agent-certs\") pod \"konnectivity-agent-h4z9c\" (UID: \"61331cbf-bfdf-44cd-895b-21d09c03e3a3\") " pod="kube-system/konnectivity-agent-h4z9c" Apr 24 21:27:23.507341 
ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506541 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db3d63f4-067a-47a5-b441-e08cbb119ecd-ovnkube-config\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.507341 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506546 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-etc-sysctl-d\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.508133 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506571 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e844355b-199a-4f53-993b-84603868363e-etc-selinux\") pod \"aws-ebs-csi-driver-node-x5lzw\" (UID: \"e844355b-199a-4f53-993b-84603868363e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" Apr 24 21:27:23.508133 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506598 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-host-run-k8s-cni-cncf-io\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.508133 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506657 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-host-run-netns\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.508133 
ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506699 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-host-run-netns\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.508133 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506728 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f4586cbe-e12f-4084-9d26-5a60d4858635-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2" Apr 24 21:27:23.508133 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506759 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e844355b-199a-4f53-993b-84603868363e-device-dir\") pod \"aws-ebs-csi-driver-node-x5lzw\" (UID: \"e844355b-199a-4f53-993b-84603868363e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" Apr 24 21:27:23.508133 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506788 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-multus-cni-dir\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.508133 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506818 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cb4c0791-332c-4626-884a-8947b04761c9-multus-daemon-config\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " 
pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.508133 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.506845 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-host-cni-netd\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.508133 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.507832 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.508133 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.507905 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f4586cbe-e12f-4084-9d26-5a60d4858635-cnibin\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2" Apr 24 21:27:23.508133 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.508065 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e844355b-199a-4f53-993b-84603868363e-registration-dir\") pod \"aws-ebs-csi-driver-node-x5lzw\" (UID: \"e844355b-199a-4f53-993b-84603868363e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" Apr 24 21:27:23.508133 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.508124 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-host-var-lib-cni-bin\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.508671 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.508179 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-hostroot\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.508879 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.508787 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/db3d63f4-067a-47a5-b441-e08cbb119ecd-ovnkube-script-lib\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.512966 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.509197 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/db3d63f4-067a-47a5-b441-e08cbb119ecd-ovn-node-metrics-cert\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.512966 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.508789 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e844355b-199a-4f53-993b-84603868363e-device-dir\") pod \"aws-ebs-csi-driver-node-x5lzw\" (UID: \"e844355b-199a-4f53-993b-84603868363e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" Apr 24 21:27:23.512966 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.509471 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/84da9595-3baf-4bee-854d-b2858b093de3-tmp\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.512966 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.509483 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/db3d63f4-067a-47a5-b441-e08cbb119ecd-ovnkube-config\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.512966 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.509539 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-host-run-k8s-cni-cncf-io\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.512966 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.507961 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klkst\" (UniqueName: \"kubernetes.io/projected/f4586cbe-e12f-4084-9d26-5a60d4858635-kube-api-access-klkst\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2" Apr 24 21:27:23.512966 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.509631 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-host-run-netns\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.512966 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.509635 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" 
(UniqueName: \"kubernetes.io/host-path/e844355b-199a-4f53-993b-84603868363e-etc-selinux\") pod \"aws-ebs-csi-driver-node-x5lzw\" (UID: \"e844355b-199a-4f53-993b-84603868363e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" Apr 24 21:27:23.512966 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.509676 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-host-run-netns\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.512966 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.509648 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hzzxv\" (UniqueName: \"kubernetes.io/projected/cb4c0791-332c-4626-884a-8947b04761c9-kube-api-access-hzzxv\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.512966 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.509781 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-multus-cni-dir\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.512966 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.509921 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-var-lib-kubelet\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.512966 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.509974 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/84da9595-3baf-4bee-854d-b2858b093de3-etc-tuned\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.512966 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.510002 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.512966 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.510041 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/cb4c0791-332c-4626-884a-8947b04761c9-multus-daemon-config\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.512966 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.510082 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-systemd-units\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.512966 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.510088 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-var-lib-kubelet\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.513837 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.510146 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f4586cbe-e12f-4084-9d26-5a60d4858635-cni-binary-copy\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2" Apr 24 21:27:23.513837 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.510189 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-systemd-units\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.513837 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.510209 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5qkm\" (UniqueName: \"kubernetes.io/projected/acca6a48-f7ff-4dec-82df-945011bc308d-kube-api-access-k5qkm\") pod \"network-check-target-2f65f\" (UID: \"acca6a48-f7ff-4dec-82df-945011bc308d\") " pod="openshift-network-diagnostics/network-check-target-2f65f" Apr 24 21:27:23.513837 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.510261 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e844355b-199a-4f53-993b-84603868363e-socket-dir\") pod \"aws-ebs-csi-driver-node-x5lzw\" (UID: \"e844355b-199a-4f53-993b-84603868363e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" Apr 24 21:27:23.513837 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.510293 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-multus-conf-dir\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.513837 ip-10-0-130-31 kubenswrapper[2581]: 
I0424 21:27:23.510473 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/61331cbf-bfdf-44cd-895b-21d09c03e3a3-konnectivity-ca\") pod \"konnectivity-agent-h4z9c\" (UID: \"61331cbf-bfdf-44cd-895b-21d09c03e3a3\") " pod="kube-system/konnectivity-agent-h4z9c" Apr 24 21:27:23.513837 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.510415 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-multus-conf-dir\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg" Apr 24 21:27:23.513837 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.511123 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/61331cbf-bfdf-44cd-895b-21d09c03e3a3-konnectivity-ca\") pod \"konnectivity-agent-h4z9c\" (UID: \"61331cbf-bfdf-44cd-895b-21d09c03e3a3\") " pod="kube-system/konnectivity-agent-h4z9c" Apr 24 21:27:23.513837 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.511262 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-etc-modprobe-d\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.513837 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.510418 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e844355b-199a-4f53-993b-84603868363e-socket-dir\") pod \"aws-ebs-csi-driver-node-x5lzw\" (UID: \"e844355b-199a-4f53-993b-84603868363e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" Apr 24 21:27:23.513837 ip-10-0-130-31 kubenswrapper[2581]: I0424 
21:27:23.511315 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/61331cbf-bfdf-44cd-895b-21d09c03e3a3-agent-certs\") pod \"konnectivity-agent-h4z9c\" (UID: \"61331cbf-bfdf-44cd-895b-21d09c03e3a3\") " pod="kube-system/konnectivity-agent-h4z9c" Apr 24 21:27:23.513837 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.511326 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-host-cni-netd\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.513837 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.511376 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-etc-modprobe-d\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" Apr 24 21:27:23.513837 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.511380 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-run-systemd\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.513837 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.511408 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-node-log\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" Apr 24 21:27:23.513837 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.511433 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ad14c53e-e5b6-4cbb-9e60-af19eb6027a6-iptables-alerter-script\") pod \"iptables-alerter-45fvz\" (UID: \"ad14c53e-e5b6-4cbb-9e60-af19eb6027a6\") " pod="openshift-network-operator/iptables-alerter-45fvz"
Apr 24 21:27:23.513837 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.511471 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-run-systemd\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf"
Apr 24 21:27:23.514628 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.511499 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/db3d63f4-067a-47a5-b441-e08cbb119ecd-node-log\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf"
Apr 24 21:27:23.514628 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.511542 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f4586cbe-e12f-4084-9d26-5a60d4858635-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2"
Apr 24 21:27:23.514628 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.511617 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e844355b-199a-4f53-993b-84603868363e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-x5lzw\" (UID: \"e844355b-199a-4f53-993b-84603868363e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw"
Apr 24 21:27:23.514628 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.511652 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-os-release\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg"
Apr 24 21:27:23.514628 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.511704 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e844355b-199a-4f53-993b-84603868363e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-x5lzw\" (UID: \"e844355b-199a-4f53-993b-84603868363e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw"
Apr 24 21:27:23.514628 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.511696 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cb4c0791-332c-4626-884a-8947b04761c9-cni-binary-copy\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg"
Apr 24 21:27:23.514628 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.511762 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-host-var-lib-kubelet\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg"
Apr 24 21:27:23.514628 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.511778 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-os-release\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg"
Apr 24 21:27:23.514628 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.511812 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-host-run-multus-certs\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg"
Apr 24 21:27:23.514628 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.511844 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-etc-kubernetes\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv"
Apr 24 21:27:23.514628 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.511874 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/964dbac9-11de-44a8-b2ea-152ca4914413-hosts-file\") pod \"node-resolver-hjwlf\" (UID: \"964dbac9-11de-44a8-b2ea-152ca4914413\") " pod="openshift-dns/node-resolver-hjwlf"
Apr 24 21:27:23.514628 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.511936 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-system-cni-dir\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg"
Apr 24 21:27:23.514628 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.512002 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-etc-sysconfig\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv"
Apr 24 21:27:23.514628 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.512134 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ad14c53e-e5b6-4cbb-9e60-af19eb6027a6-iptables-alerter-script\") pod \"iptables-alerter-45fvz\" (UID: \"ad14c53e-e5b6-4cbb-9e60-af19eb6027a6\") " pod="openshift-network-operator/iptables-alerter-45fvz"
Apr 24 21:27:23.514628 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.512195 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-host-run-multus-certs\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg"
Apr 24 21:27:23.514628 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.512250 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-etc-sysconfig\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv"
Apr 24 21:27:23.514628 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.512267 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/84da9595-3baf-4bee-854d-b2858b093de3-etc-kubernetes\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv"
Apr 24 21:27:23.514628 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.512310 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-host-var-lib-kubelet\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg"
Apr 24 21:27:23.515491 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.512321 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cb4c0791-332c-4626-884a-8947b04761c9-cni-binary-copy\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg"
Apr 24 21:27:23.515491 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.512369 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/964dbac9-11de-44a8-b2ea-152ca4914413-hosts-file\") pod \"node-resolver-hjwlf\" (UID: \"964dbac9-11de-44a8-b2ea-152ca4914413\") " pod="openshift-dns/node-resolver-hjwlf"
Apr 24 21:27:23.515491 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.512397 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cb4c0791-332c-4626-884a-8947b04761c9-system-cni-dir\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg"
Apr 24 21:27:23.521272 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.521232 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nngrj\" (UniqueName: \"kubernetes.io/projected/e844355b-199a-4f53-993b-84603868363e-kube-api-access-nngrj\") pod \"aws-ebs-csi-driver-node-x5lzw\" (UID: \"e844355b-199a-4f53-993b-84603868363e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw"
Apr 24 21:27:23.522104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.522070 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-knz86\" (UniqueName: \"kubernetes.io/projected/ad14c53e-e5b6-4cbb-9e60-af19eb6027a6-kube-api-access-knz86\") pod \"iptables-alerter-45fvz\" (UID: \"ad14c53e-e5b6-4cbb-9e60-af19eb6027a6\") " pod="openshift-network-operator/iptables-alerter-45fvz"
Apr 24 21:27:23.522383 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.522335 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7rcw\" (UniqueName: \"kubernetes.io/projected/964dbac9-11de-44a8-b2ea-152ca4914413-kube-api-access-l7rcw\") pod \"node-resolver-hjwlf\" (UID: \"964dbac9-11de-44a8-b2ea-152ca4914413\") " pod="openshift-dns/node-resolver-hjwlf"
Apr 24 21:27:23.522467 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.522452 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzzxv\" (UniqueName: \"kubernetes.io/projected/cb4c0791-332c-4626-884a-8947b04761c9-kube-api-access-hzzxv\") pod \"multus-2ptsg\" (UID: \"cb4c0791-332c-4626-884a-8947b04761c9\") " pod="openshift-multus/multus-2ptsg"
Apr 24 21:27:23.522592 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.522569 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m724r\" (UniqueName: \"kubernetes.io/projected/84da9595-3baf-4bee-854d-b2858b093de3-kube-api-access-m724r\") pod \"tuned-lnqjv\" (UID: \"84da9595-3baf-4bee-854d-b2858b093de3\") " pod="openshift-cluster-node-tuning-operator/tuned-lnqjv"
Apr 24 21:27:23.522818 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.522795 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kql94\" (UniqueName: \"kubernetes.io/projected/db3d63f4-067a-47a5-b441-e08cbb119ecd-kube-api-access-kql94\") pod \"ovnkube-node-8j4mf\" (UID: \"db3d63f4-067a-47a5-b441-e08cbb119ecd\") " pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf"
Apr 24 21:27:23.523404 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.523383 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49426\" (UniqueName: \"kubernetes.io/projected/4b00449f-18b5-4507-83ec-4a003e10f7fb-kube-api-access-49426\") pod \"node-ca-kplsb\" (UID: \"4b00449f-18b5-4507-83ec-4a003e10f7fb\") " pod="openshift-image-registry/node-ca-kplsb"
Apr 24 21:27:23.613296 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.612367 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f4586cbe-e12f-4084-9d26-5a60d4858635-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2"
Apr 24 21:27:23.613296 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.612407 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f4586cbe-e12f-4084-9d26-5a60d4858635-system-cni-dir\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2"
Apr 24 21:27:23.613296 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.612434 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjkr6\" (UniqueName: \"kubernetes.io/projected/5dfd7cf1-e10a-410e-b412-be269391a904-kube-api-access-mjkr6\") pod \"network-metrics-daemon-489tz\" (UID: \"5dfd7cf1-e10a-410e-b412-be269391a904\") " pod="openshift-multus/network-metrics-daemon-489tz"
Apr 24 21:27:23.613296 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.612460 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f4586cbe-e12f-4084-9d26-5a60d4858635-os-release\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2"
Apr 24 21:27:23.613296 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.612561 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f4586cbe-e12f-4084-9d26-5a60d4858635-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2"
Apr 24 21:27:23.613296 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.612580 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f4586cbe-e12f-4084-9d26-5a60d4858635-os-release\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2"
Apr 24 21:27:23.613296 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.612616 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f4586cbe-e12f-4084-9d26-5a60d4858635-cnibin\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2"
Apr 24 21:27:23.613296 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.612648 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klkst\" (UniqueName: \"kubernetes.io/projected/f4586cbe-e12f-4084-9d26-5a60d4858635-kube-api-access-klkst\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2"
Apr 24 21:27:23.613296 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.612658 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f4586cbe-e12f-4084-9d26-5a60d4858635-cnibin\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2"
Apr 24 21:27:23.613296 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.612661 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f4586cbe-e12f-4084-9d26-5a60d4858635-system-cni-dir\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2"
Apr 24 21:27:23.613296 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.612675 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f4586cbe-e12f-4084-9d26-5a60d4858635-cni-binary-copy\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2"
Apr 24 21:27:23.613296 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.612700 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5qkm\" (UniqueName: \"kubernetes.io/projected/acca6a48-f7ff-4dec-82df-945011bc308d-kube-api-access-k5qkm\") pod \"network-check-target-2f65f\" (UID: \"acca6a48-f7ff-4dec-82df-945011bc308d\") " pod="openshift-network-diagnostics/network-check-target-2f65f"
Apr 24 21:27:23.613296 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.612730 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f4586cbe-e12f-4084-9d26-5a60d4858635-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2"
Apr 24 21:27:23.613296 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.612743 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f4586cbe-e12f-4084-9d26-5a60d4858635-tuning-conf-dir\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2"
Apr 24 21:27:23.613296 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.612770 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs\") pod \"network-metrics-daemon-489tz\" (UID: \"5dfd7cf1-e10a-410e-b412-be269391a904\") " pod="openshift-multus/network-metrics-daemon-489tz"
Apr 24 21:27:23.613296 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:23.612922 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:23.613296 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.612939 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f4586cbe-e12f-4084-9d26-5a60d4858635-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2"
Apr 24 21:27:23.614122 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:23.612995 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs podName:5dfd7cf1-e10a-410e-b412-be269391a904 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:24.112972809 +0000 UTC m=+3.222066288 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs") pod "network-metrics-daemon-489tz" (UID: "5dfd7cf1-e10a-410e-b412-be269391a904") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:23.614122 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.613246 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f4586cbe-e12f-4084-9d26-5a60d4858635-cni-binary-copy\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2"
Apr 24 21:27:23.614122 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.613956 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f4586cbe-e12f-4084-9d26-5a60d4858635-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2"
Apr 24 21:27:23.621677 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:23.621655 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:23.621677 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:23.621674 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:23.621879 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:23.621687 2581 projected.go:194] Error preparing data for projected volume kube-api-access-k5qkm for pod openshift-network-diagnostics/network-check-target-2f65f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:23.621879 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:23.621809 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acca6a48-f7ff-4dec-82df-945011bc308d-kube-api-access-k5qkm podName:acca6a48-f7ff-4dec-82df-945011bc308d nodeName:}" failed. No retries permitted until 2026-04-24 21:27:24.121794751 +0000 UTC m=+3.230888232 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-k5qkm" (UniqueName: "kubernetes.io/projected/acca6a48-f7ff-4dec-82df-945011bc308d-kube-api-access-k5qkm") pod "network-check-target-2f65f" (UID: "acca6a48-f7ff-4dec-82df-945011bc308d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:23.624531 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.624504 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjkr6\" (UniqueName: \"kubernetes.io/projected/5dfd7cf1-e10a-410e-b412-be269391a904-kube-api-access-mjkr6\") pod \"network-metrics-daemon-489tz\" (UID: \"5dfd7cf1-e10a-410e-b412-be269391a904\") " pod="openshift-multus/network-metrics-daemon-489tz"
Apr 24 21:27:23.625047 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.625002 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klkst\" (UniqueName: \"kubernetes.io/projected/f4586cbe-e12f-4084-9d26-5a60d4858635-kube-api-access-klkst\") pod \"multus-additional-cni-plugins-ps7q2\" (UID: \"f4586cbe-e12f-4084-9d26-5a60d4858635\") " pod="openshift-multus/multus-additional-cni-plugins-ps7q2"
Apr 24 21:27:23.690515 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.690426 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf"
Apr 24 21:27:23.701483 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.701442 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw"
Apr 24 21:27:23.708251 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.708232 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hjwlf"
Apr 24 21:27:23.715608 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.715578 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kplsb"
Apr 24 21:27:23.722206 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.722184 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2ptsg"
Apr 24 21:27:23.729772 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.729756 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-45fvz"
Apr 24 21:27:23.735378 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.735360 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lnqjv"
Apr 24 21:27:23.741892 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.741875 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-h4z9c"
Apr 24 21:27:23.746349 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:23.746329 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-ps7q2"
Apr 24 21:27:24.089664 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:24.089515 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61331cbf_bfdf_44cd_895b_21d09c03e3a3.slice/crio-1ce3148ba0b443fc1439da503cea0e28c582abdc147bb4d7c2cd3a35d89fc714 WatchSource:0}: Error finding container 1ce3148ba0b443fc1439da503cea0e28c582abdc147bb4d7c2cd3a35d89fc714: Status 404 returned error can't find the container with id 1ce3148ba0b443fc1439da503cea0e28c582abdc147bb4d7c2cd3a35d89fc714
Apr 24 21:27:24.090962 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:24.090873 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode844355b_199a_4f53_993b_84603868363e.slice/crio-a5a516befaa29943cc85d46419c6ffc35c432e73fbbf7fa3a17133670a8ec05c WatchSource:0}: Error finding container a5a516befaa29943cc85d46419c6ffc35c432e73fbbf7fa3a17133670a8ec05c: Status 404 returned error can't find the container with id a5a516befaa29943cc85d46419c6ffc35c432e73fbbf7fa3a17133670a8ec05c
Apr 24 21:27:24.092686 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:24.092663 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4586cbe_e12f_4084_9d26_5a60d4858635.slice/crio-3c774180100f27044183917de0fddd4750242865312e91c8831df0ff319c45ac WatchSource:0}: Error finding container 3c774180100f27044183917de0fddd4750242865312e91c8831df0ff319c45ac: Status 404 returned error can't find the container with id 3c774180100f27044183917de0fddd4750242865312e91c8831df0ff319c45ac
Apr 24 21:27:24.095264 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:24.095237 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b00449f_18b5_4507_83ec_4a003e10f7fb.slice/crio-0f9707c6c09ba956e8ac8b0f35ad1ba6491a0a4181a4b6519c7d529456f5489f WatchSource:0}: Error finding container 0f9707c6c09ba956e8ac8b0f35ad1ba6491a0a4181a4b6519c7d529456f5489f: Status 404 returned error can't find the container with id 0f9707c6c09ba956e8ac8b0f35ad1ba6491a0a4181a4b6519c7d529456f5489f
Apr 24 21:27:24.098910 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:24.098890 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84da9595_3baf_4bee_854d_b2858b093de3.slice/crio-60230966ef549988233cc37fa4ad458b7680857d8a4b2f3fd6de845814103efa WatchSource:0}: Error finding container 60230966ef549988233cc37fa4ad458b7680857d8a4b2f3fd6de845814103efa: Status 404 returned error can't find the container with id 60230966ef549988233cc37fa4ad458b7680857d8a4b2f3fd6de845814103efa
Apr 24 21:27:24.099986 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:24.099959 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad14c53e_e5b6_4cbb_9e60_af19eb6027a6.slice/crio-9469cc75cceadd832e5d40933f06b7be94da5d418ae8acaf821799e163710bd5 WatchSource:0}: Error finding container 9469cc75cceadd832e5d40933f06b7be94da5d418ae8acaf821799e163710bd5: Status 404 returned error can't find the container with id 9469cc75cceadd832e5d40933f06b7be94da5d418ae8acaf821799e163710bd5
Apr 24 21:27:24.101010 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:24.100991 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb4c0791_332c_4626_884a_8947b04761c9.slice/crio-6272ebe851d47f10cf00a38fed8f886d09528c479e56b6b397d4cb0536abed47 WatchSource:0}: Error finding container 6272ebe851d47f10cf00a38fed8f886d09528c479e56b6b397d4cb0536abed47: Status 404 returned error can't find the container with id 6272ebe851d47f10cf00a38fed8f886d09528c479e56b6b397d4cb0536abed47
Apr 24 21:27:24.102833 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:24.102732 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb3d63f4_067a_47a5_b441_e08cbb119ecd.slice/crio-7d126ce7331545c509e2ea7545268836015f158dbdab0b8d4f79c92d15cfa0eb WatchSource:0}: Error finding container 7d126ce7331545c509e2ea7545268836015f158dbdab0b8d4f79c92d15cfa0eb: Status 404 returned error can't find the container with id 7d126ce7331545c509e2ea7545268836015f158dbdab0b8d4f79c92d15cfa0eb
Apr 24 21:27:24.105192 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:24.105171 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod964dbac9_11de_44a8_b2ea_152ca4914413.slice/crio-d21dcf50023b72dab892cf2ec9d15dcd3e0fbb2e81126c286c0a713cdbd26b1c WatchSource:0}: Error finding container d21dcf50023b72dab892cf2ec9d15dcd3e0fbb2e81126c286c0a713cdbd26b1c: Status 404 returned error can't find the container with id d21dcf50023b72dab892cf2ec9d15dcd3e0fbb2e81126c286c0a713cdbd26b1c
Apr 24 21:27:24.116127 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:24.116105 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs\") pod \"network-metrics-daemon-489tz\" (UID: \"5dfd7cf1-e10a-410e-b412-be269391a904\") " pod="openshift-multus/network-metrics-daemon-489tz"
Apr 24 21:27:24.116246 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:24.116232 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:24.116303 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:24.116283 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs podName:5dfd7cf1-e10a-410e-b412-be269391a904 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:25.116269727 +0000 UTC m=+4.225363189 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs") pod "network-metrics-daemon-489tz" (UID: "5dfd7cf1-e10a-410e-b412-be269391a904") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:24.216934 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:24.216909 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5qkm\" (UniqueName: \"kubernetes.io/projected/acca6a48-f7ff-4dec-82df-945011bc308d-kube-api-access-k5qkm\") pod \"network-check-target-2f65f\" (UID: \"acca6a48-f7ff-4dec-82df-945011bc308d\") " pod="openshift-network-diagnostics/network-check-target-2f65f"
Apr 24 21:27:24.217059 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:24.217050 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:24.217113 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:24.217063 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:24.217113 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:24.217072 2581 projected.go:194] Error preparing data for projected volume kube-api-access-k5qkm for pod openshift-network-diagnostics/network-check-target-2f65f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:24.217177 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:24.217114 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acca6a48-f7ff-4dec-82df-945011bc308d-kube-api-access-k5qkm podName:acca6a48-f7ff-4dec-82df-945011bc308d nodeName:}" failed. No retries permitted until 2026-04-24 21:27:25.217101688 +0000 UTC m=+4.326195165 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-k5qkm" (UniqueName: "kubernetes.io/projected/acca6a48-f7ff-4dec-82df-945011bc308d-kube-api-access-k5qkm") pod "network-check-target-2f65f" (UID: "acca6a48-f7ff-4dec-82df-945011bc308d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:24.432995 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:24.432890 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:22 +0000 UTC" deadline="2028-01-26 11:47:23.205258428 +0000 UTC"
Apr 24 21:27:24.432995 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:24.432927 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15398h19m58.7723342s"
Apr 24 21:27:24.535099 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:24.535006 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-31.ec2.internal" event={"ID":"245bce5339b00b6e9cfc0086658d8fb7","Type":"ContainerStarted","Data":"9891bad40df36cc77f5adcb7cb5a013fe8891f425ecb7d63a3626b93d7ff9bbc"}
Apr 24 21:27:24.539978 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:24.539921 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-45fvz" event={"ID":"ad14c53e-e5b6-4cbb-9e60-af19eb6027a6","Type":"ContainerStarted","Data":"9469cc75cceadd832e5d40933f06b7be94da5d418ae8acaf821799e163710bd5"}
Apr 24 21:27:24.544454 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:24.544398 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" event={"ID":"e844355b-199a-4f53-993b-84603868363e","Type":"ContainerStarted","Data":"a5a516befaa29943cc85d46419c6ffc35c432e73fbbf7fa3a17133670a8ec05c"}
Apr 24 21:27:24.549111 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:24.548527 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ps7q2" event={"ID":"f4586cbe-e12f-4084-9d26-5a60d4858635","Type":"ContainerStarted","Data":"3c774180100f27044183917de0fddd4750242865312e91c8831df0ff319c45ac"}
Apr 24 21:27:24.549111 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:24.548874 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-31.ec2.internal" podStartSLOduration=2.548861679 podStartE2EDuration="2.548861679s" podCreationTimestamp="2026-04-24 21:27:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:24.548211735 +0000 UTC m=+3.657305224" watchObservedRunningTime="2026-04-24 21:27:24.548861679 +0000 UTC m=+3.657955165"
Apr 24 21:27:24.552972 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:24.550646 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" event={"ID":"db3d63f4-067a-47a5-b441-e08cbb119ecd","Type":"ContainerStarted","Data":"7d126ce7331545c509e2ea7545268836015f158dbdab0b8d4f79c92d15cfa0eb"}
Apr 24 21:27:24.557106 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:24.553278 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2ptsg" event={"ID":"cb4c0791-332c-4626-884a-8947b04761c9","Type":"ContainerStarted","Data":"6272ebe851d47f10cf00a38fed8f886d09528c479e56b6b397d4cb0536abed47"}
Apr 24 21:27:24.557106 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:24.554483 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hjwlf" event={"ID":"964dbac9-11de-44a8-b2ea-152ca4914413","Type":"ContainerStarted","Data":"d21dcf50023b72dab892cf2ec9d15dcd3e0fbb2e81126c286c0a713cdbd26b1c"}
Apr 24 21:27:24.560662 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:24.560638 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" event={"ID":"84da9595-3baf-4bee-854d-b2858b093de3","Type":"ContainerStarted","Data":"60230966ef549988233cc37fa4ad458b7680857d8a4b2f3fd6de845814103efa"}
Apr 24 21:27:24.564724 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:24.564699 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kplsb" event={"ID":"4b00449f-18b5-4507-83ec-4a003e10f7fb","Type":"ContainerStarted","Data":"0f9707c6c09ba956e8ac8b0f35ad1ba6491a0a4181a4b6519c7d529456f5489f"}
Apr 24 21:27:24.574469 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:24.574079 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-h4z9c" event={"ID":"61331cbf-bfdf-44cd-895b-21d09c03e3a3","Type":"ContainerStarted","Data":"1ce3148ba0b443fc1439da503cea0e28c582abdc147bb4d7c2cd3a35d89fc714"}
Apr 24 21:27:25.134811 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:25.134770 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs\") pod \"network-metrics-daemon-489tz\" (UID: \"5dfd7cf1-e10a-410e-b412-be269391a904\") " pod="openshift-multus/network-metrics-daemon-489tz"
Apr 24 21:27:25.134990 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:25.134932 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:25.135079 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:25.135001 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs podName:5dfd7cf1-e10a-410e-b412-be269391a904 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:27.134981716 +0000 UTC m=+6.244075193 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs") pod "network-metrics-daemon-489tz" (UID: "5dfd7cf1-e10a-410e-b412-be269391a904") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:25.235782 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:25.235160 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5qkm\" (UniqueName: \"kubernetes.io/projected/acca6a48-f7ff-4dec-82df-945011bc308d-kube-api-access-k5qkm\") pod \"network-check-target-2f65f\" (UID: \"acca6a48-f7ff-4dec-82df-945011bc308d\") " pod="openshift-network-diagnostics/network-check-target-2f65f"
Apr 24 21:27:25.235782 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:25.235365 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:25.235782 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:25.235383 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:25.235782 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:25.235396 2581 projected.go:194] Error preparing data for projected volume kube-api-access-k5qkm for pod openshift-network-diagnostics/network-check-target-2f65f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:25.235782 ip-10-0-130-31 kubenswrapper[2581]:
E0424 21:27:25.235453 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acca6a48-f7ff-4dec-82df-945011bc308d-kube-api-access-k5qkm podName:acca6a48-f7ff-4dec-82df-945011bc308d nodeName:}" failed. No retries permitted until 2026-04-24 21:27:27.235434035 +0000 UTC m=+6.344527505 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-k5qkm" (UniqueName: "kubernetes.io/projected/acca6a48-f7ff-4dec-82df-945011bc308d-kube-api-access-k5qkm") pod "network-check-target-2f65f" (UID: "acca6a48-f7ff-4dec-82df-945011bc308d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:25.519275 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:25.519242 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2f65f" Apr 24 21:27:25.519712 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:25.519382 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2f65f" podUID="acca6a48-f7ff-4dec-82df-945011bc308d" Apr 24 21:27:25.519712 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:25.519446 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-489tz" Apr 24 21:27:25.519712 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:25.519539 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-489tz" podUID="5dfd7cf1-e10a-410e-b412-be269391a904" Apr 24 21:27:25.599967 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:25.599921 2581 generic.go:358] "Generic (PLEG): container finished" podID="72565ecde5f90cfca62d0623671f5130" containerID="516d5a665bc30204b1ee656f737428ad56d1390cb3c7b7c4f1c0c6275d71025f" exitCode=0 Apr 24 21:27:25.600983 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:25.600956 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-31.ec2.internal" event={"ID":"72565ecde5f90cfca62d0623671f5130","Type":"ContainerDied","Data":"516d5a665bc30204b1ee656f737428ad56d1390cb3c7b7c4f1c0c6275d71025f"} Apr 24 21:27:26.618659 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:26.617962 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-31.ec2.internal" event={"ID":"72565ecde5f90cfca62d0623671f5130","Type":"ContainerStarted","Data":"1fd4244fe82f0da5e4c954f7f32692d31fbaef0d19dad6f6b5e82bfd0c856b26"} Apr 24 21:27:27.150331 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:27.150297 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs\") pod \"network-metrics-daemon-489tz\" (UID: \"5dfd7cf1-e10a-410e-b412-be269391a904\") " pod="openshift-multus/network-metrics-daemon-489tz" Apr 24 21:27:27.150519 
ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:27.150457 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:27.150592 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:27.150524 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs podName:5dfd7cf1-e10a-410e-b412-be269391a904 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:31.150504726 +0000 UTC m=+10.259598202 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs") pod "network-metrics-daemon-489tz" (UID: "5dfd7cf1-e10a-410e-b412-be269391a904") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:27.251080 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:27.251037 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5qkm\" (UniqueName: \"kubernetes.io/projected/acca6a48-f7ff-4dec-82df-945011bc308d-kube-api-access-k5qkm\") pod \"network-check-target-2f65f\" (UID: \"acca6a48-f7ff-4dec-82df-945011bc308d\") " pod="openshift-network-diagnostics/network-check-target-2f65f" Apr 24 21:27:27.251266 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:27.251208 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:27.251266 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:27.251232 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:27.251266 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:27.251245 2581 projected.go:194] Error preparing data for projected volume 
kube-api-access-k5qkm for pod openshift-network-diagnostics/network-check-target-2f65f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:27.251419 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:27.251305 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acca6a48-f7ff-4dec-82df-945011bc308d-kube-api-access-k5qkm podName:acca6a48-f7ff-4dec-82df-945011bc308d nodeName:}" failed. No retries permitted until 2026-04-24 21:27:31.251287204 +0000 UTC m=+10.360380679 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-k5qkm" (UniqueName: "kubernetes.io/projected/acca6a48-f7ff-4dec-82df-945011bc308d-kube-api-access-k5qkm") pod "network-check-target-2f65f" (UID: "acca6a48-f7ff-4dec-82df-945011bc308d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:27.522388 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:27.519177 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2f65f" Apr 24 21:27:27.522388 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:27.519315 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2f65f" podUID="acca6a48-f7ff-4dec-82df-945011bc308d" Apr 24 21:27:27.522388 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:27.519722 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-489tz" Apr 24 21:27:27.522388 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:27.519824 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-489tz" podUID="5dfd7cf1-e10a-410e-b412-be269391a904" Apr 24 21:27:29.518695 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:29.518664 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-489tz" Apr 24 21:27:29.519166 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:29.518789 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-489tz" podUID="5dfd7cf1-e10a-410e-b412-be269391a904" Apr 24 21:27:29.519166 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:29.519119 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2f65f" Apr 24 21:27:29.519268 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:29.519202 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2f65f" podUID="acca6a48-f7ff-4dec-82df-945011bc308d" Apr 24 21:27:31.185693 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:31.185638 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs\") pod \"network-metrics-daemon-489tz\" (UID: \"5dfd7cf1-e10a-410e-b412-be269391a904\") " pod="openshift-multus/network-metrics-daemon-489tz" Apr 24 21:27:31.186172 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:31.185853 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:31.186172 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:31.185922 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs podName:5dfd7cf1-e10a-410e-b412-be269391a904 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:39.185900458 +0000 UTC m=+18.294993936 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs") pod "network-metrics-daemon-489tz" (UID: "5dfd7cf1-e10a-410e-b412-be269391a904") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:31.286784 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:31.286741 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5qkm\" (UniqueName: \"kubernetes.io/projected/acca6a48-f7ff-4dec-82df-945011bc308d-kube-api-access-k5qkm\") pod \"network-check-target-2f65f\" (UID: \"acca6a48-f7ff-4dec-82df-945011bc308d\") " pod="openshift-network-diagnostics/network-check-target-2f65f" Apr 24 21:27:31.286959 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:31.286924 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:31.286959 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:31.286947 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:31.286959 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:31.286960 2581 projected.go:194] Error preparing data for projected volume kube-api-access-k5qkm for pod openshift-network-diagnostics/network-check-target-2f65f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:31.287147 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:31.287035 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acca6a48-f7ff-4dec-82df-945011bc308d-kube-api-access-k5qkm podName:acca6a48-f7ff-4dec-82df-945011bc308d nodeName:}" failed. 
No retries permitted until 2026-04-24 21:27:39.287000077 +0000 UTC m=+18.396093546 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-k5qkm" (UniqueName: "kubernetes.io/projected/acca6a48-f7ff-4dec-82df-945011bc308d-kube-api-access-k5qkm") pod "network-check-target-2f65f" (UID: "acca6a48-f7ff-4dec-82df-945011bc308d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:31.519373 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:31.519287 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-489tz" Apr 24 21:27:31.519517 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:31.519420 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-489tz" podUID="5dfd7cf1-e10a-410e-b412-be269391a904" Apr 24 21:27:31.519938 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:31.519790 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2f65f" Apr 24 21:27:31.519938 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:31.519887 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2f65f" podUID="acca6a48-f7ff-4dec-82df-945011bc308d" Apr 24 21:27:33.518488 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:33.518453 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2f65f" Apr 24 21:27:33.518887 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:33.518569 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2f65f" podUID="acca6a48-f7ff-4dec-82df-945011bc308d" Apr 24 21:27:33.518887 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:33.518862 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-489tz" Apr 24 21:27:33.518958 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:33.518938 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-489tz" podUID="5dfd7cf1-e10a-410e-b412-be269391a904" Apr 24 21:27:34.988322 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:34.988275 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-31.ec2.internal" podStartSLOduration=12.988260052 podStartE2EDuration="12.988260052s" podCreationTimestamp="2026-04-24 21:27:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:26.635364693 +0000 UTC m=+5.744458180" watchObservedRunningTime="2026-04-24 21:27:34.988260052 +0000 UTC m=+14.097353536" Apr 24 21:27:34.988805 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:34.988785 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-rklj9"] Apr 24 21:27:34.991498 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:34.991474 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rklj9" Apr 24 21:27:34.991619 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:34.991548 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rklj9" podUID="151371d6-8756-495b-8181-7fdcb156d1f4" Apr 24 21:27:35.114311 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:35.114280 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/151371d6-8756-495b-8181-7fdcb156d1f4-dbus\") pod \"global-pull-secret-syncer-rklj9\" (UID: \"151371d6-8756-495b-8181-7fdcb156d1f4\") " pod="kube-system/global-pull-secret-syncer-rklj9" Apr 24 21:27:35.114476 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:35.114343 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/151371d6-8756-495b-8181-7fdcb156d1f4-kubelet-config\") pod \"global-pull-secret-syncer-rklj9\" (UID: \"151371d6-8756-495b-8181-7fdcb156d1f4\") " pod="kube-system/global-pull-secret-syncer-rklj9" Apr 24 21:27:35.114476 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:35.114368 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/151371d6-8756-495b-8181-7fdcb156d1f4-original-pull-secret\") pod \"global-pull-secret-syncer-rklj9\" (UID: \"151371d6-8756-495b-8181-7fdcb156d1f4\") " pod="kube-system/global-pull-secret-syncer-rklj9" Apr 24 21:27:35.215679 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:35.215644 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/151371d6-8756-495b-8181-7fdcb156d1f4-kubelet-config\") pod \"global-pull-secret-syncer-rklj9\" (UID: \"151371d6-8756-495b-8181-7fdcb156d1f4\") " pod="kube-system/global-pull-secret-syncer-rklj9" Apr 24 21:27:35.215679 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:35.215681 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/151371d6-8756-495b-8181-7fdcb156d1f4-original-pull-secret\") pod \"global-pull-secret-syncer-rklj9\" (UID: \"151371d6-8756-495b-8181-7fdcb156d1f4\") " pod="kube-system/global-pull-secret-syncer-rklj9" Apr 24 21:27:35.215885 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:35.215756 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/151371d6-8756-495b-8181-7fdcb156d1f4-dbus\") pod \"global-pull-secret-syncer-rklj9\" (UID: \"151371d6-8756-495b-8181-7fdcb156d1f4\") " pod="kube-system/global-pull-secret-syncer-rklj9" Apr 24 21:27:35.215885 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:35.215793 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/151371d6-8756-495b-8181-7fdcb156d1f4-kubelet-config\") pod \"global-pull-secret-syncer-rklj9\" (UID: \"151371d6-8756-495b-8181-7fdcb156d1f4\") " pod="kube-system/global-pull-secret-syncer-rklj9" Apr 24 21:27:35.215885 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:35.215859 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:35.216004 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:35.215892 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/151371d6-8756-495b-8181-7fdcb156d1f4-dbus\") pod \"global-pull-secret-syncer-rklj9\" (UID: \"151371d6-8756-495b-8181-7fdcb156d1f4\") " pod="kube-system/global-pull-secret-syncer-rklj9" Apr 24 21:27:35.216004 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:35.215920 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/151371d6-8756-495b-8181-7fdcb156d1f4-original-pull-secret podName:151371d6-8756-495b-8181-7fdcb156d1f4 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:27:35.715901517 +0000 UTC m=+14.824994983 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/151371d6-8756-495b-8181-7fdcb156d1f4-original-pull-secret") pod "global-pull-secret-syncer-rklj9" (UID: "151371d6-8756-495b-8181-7fdcb156d1f4") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:35.518933 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:35.518905 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-489tz" Apr 24 21:27:35.519114 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:35.519046 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-489tz" podUID="5dfd7cf1-e10a-410e-b412-be269391a904" Apr 24 21:27:35.519114 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:35.519092 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2f65f" Apr 24 21:27:35.519209 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:35.519179 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2f65f" podUID="acca6a48-f7ff-4dec-82df-945011bc308d" Apr 24 21:27:35.720192 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:35.720160 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/151371d6-8756-495b-8181-7fdcb156d1f4-original-pull-secret\") pod \"global-pull-secret-syncer-rklj9\" (UID: \"151371d6-8756-495b-8181-7fdcb156d1f4\") " pod="kube-system/global-pull-secret-syncer-rklj9" Apr 24 21:27:35.720364 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:35.720292 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:35.720409 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:35.720365 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/151371d6-8756-495b-8181-7fdcb156d1f4-original-pull-secret podName:151371d6-8756-495b-8181-7fdcb156d1f4 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:36.720345952 +0000 UTC m=+15.829439457 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/151371d6-8756-495b-8181-7fdcb156d1f4-original-pull-secret") pod "global-pull-secret-syncer-rklj9" (UID: "151371d6-8756-495b-8181-7fdcb156d1f4") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:36.518915 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:36.518884 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rklj9" Apr 24 21:27:36.519351 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:36.518985 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rklj9" podUID="151371d6-8756-495b-8181-7fdcb156d1f4" Apr 24 21:27:36.727113 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:36.727074 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/151371d6-8756-495b-8181-7fdcb156d1f4-original-pull-secret\") pod \"global-pull-secret-syncer-rklj9\" (UID: \"151371d6-8756-495b-8181-7fdcb156d1f4\") " pod="kube-system/global-pull-secret-syncer-rklj9" Apr 24 21:27:36.727292 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:36.727209 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:36.727292 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:36.727277 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/151371d6-8756-495b-8181-7fdcb156d1f4-original-pull-secret podName:151371d6-8756-495b-8181-7fdcb156d1f4 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:38.727260087 +0000 UTC m=+17.836353553 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/151371d6-8756-495b-8181-7fdcb156d1f4-original-pull-secret") pod "global-pull-secret-syncer-rklj9" (UID: "151371d6-8756-495b-8181-7fdcb156d1f4") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:37.518804 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:37.518766 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-489tz"
Apr 24 21:27:37.518970 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:37.518822 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2f65f"
Apr 24 21:27:37.518970 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:37.518905 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-489tz" podUID="5dfd7cf1-e10a-410e-b412-be269391a904"
Apr 24 21:27:37.519413 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:37.519058 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2f65f" podUID="acca6a48-f7ff-4dec-82df-945011bc308d"
Apr 24 21:27:38.519349 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:38.519310 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rklj9"
Apr 24 21:27:38.519870 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:38.519453 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rklj9" podUID="151371d6-8756-495b-8181-7fdcb156d1f4"
Apr 24 21:27:38.739741 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:38.739702 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/151371d6-8756-495b-8181-7fdcb156d1f4-original-pull-secret\") pod \"global-pull-secret-syncer-rklj9\" (UID: \"151371d6-8756-495b-8181-7fdcb156d1f4\") " pod="kube-system/global-pull-secret-syncer-rklj9"
Apr 24 21:27:38.739936 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:38.739858 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:38.739936 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:38.739924 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/151371d6-8756-495b-8181-7fdcb156d1f4-original-pull-secret podName:151371d6-8756-495b-8181-7fdcb156d1f4 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:42.739905081 +0000 UTC m=+21.848998544 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/151371d6-8756-495b-8181-7fdcb156d1f4-original-pull-secret") pod "global-pull-secret-syncer-rklj9" (UID: "151371d6-8756-495b-8181-7fdcb156d1f4") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:39.244006 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:39.243964 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs\") pod \"network-metrics-daemon-489tz\" (UID: \"5dfd7cf1-e10a-410e-b412-be269391a904\") " pod="openshift-multus/network-metrics-daemon-489tz"
Apr 24 21:27:39.244179 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:39.244082 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:39.244179 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:39.244144 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs podName:5dfd7cf1-e10a-410e-b412-be269391a904 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:55.244127678 +0000 UTC m=+34.353221160 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs") pod "network-metrics-daemon-489tz" (UID: "5dfd7cf1-e10a-410e-b412-be269391a904") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:39.345085 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:39.345046 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5qkm\" (UniqueName: \"kubernetes.io/projected/acca6a48-f7ff-4dec-82df-945011bc308d-kube-api-access-k5qkm\") pod \"network-check-target-2f65f\" (UID: \"acca6a48-f7ff-4dec-82df-945011bc308d\") " pod="openshift-network-diagnostics/network-check-target-2f65f"
Apr 24 21:27:39.345278 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:39.345105 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:39.345278 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:39.345132 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:39.345278 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:39.345146 2581 projected.go:194] Error preparing data for projected volume kube-api-access-k5qkm for pod openshift-network-diagnostics/network-check-target-2f65f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:39.345278 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:39.345212 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/acca6a48-f7ff-4dec-82df-945011bc308d-kube-api-access-k5qkm podName:acca6a48-f7ff-4dec-82df-945011bc308d nodeName:}" failed. No retries permitted until 2026-04-24 21:27:55.345193041 +0000 UTC m=+34.454286548 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-k5qkm" (UniqueName: "kubernetes.io/projected/acca6a48-f7ff-4dec-82df-945011bc308d-kube-api-access-k5qkm") pod "network-check-target-2f65f" (UID: "acca6a48-f7ff-4dec-82df-945011bc308d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:39.518812 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:39.518725 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2f65f"
Apr 24 21:27:39.518950 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:39.518858 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2f65f" podUID="acca6a48-f7ff-4dec-82df-945011bc308d"
Apr 24 21:27:39.518950 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:39.518890 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-489tz"
Apr 24 21:27:39.519073 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:39.518967 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-489tz" podUID="5dfd7cf1-e10a-410e-b412-be269391a904"
Apr 24 21:27:40.519222 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:40.519181 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rklj9"
Apr 24 21:27:40.519623 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:40.519319 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rklj9" podUID="151371d6-8756-495b-8181-7fdcb156d1f4"
Apr 24 21:27:41.519378 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:41.519175 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2f65f"
Apr 24 21:27:41.520005 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:41.519484 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2f65f" podUID="acca6a48-f7ff-4dec-82df-945011bc308d"
Apr 24 21:27:41.520005 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:41.519234 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-489tz"
Apr 24 21:27:41.520005 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:41.519659 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-489tz" podUID="5dfd7cf1-e10a-410e-b412-be269391a904"
Apr 24 21:27:41.643757 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:41.643728 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" event={"ID":"db3d63f4-067a-47a5-b441-e08cbb119ecd","Type":"ContainerStarted","Data":"0cd332542fac62d38ed41ab1752f6ee3abd0d3f73565c4a045b4b3ac26545fa9"}
Apr 24 21:27:41.645343 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:41.645312 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2ptsg" event={"ID":"cb4c0791-332c-4626-884a-8947b04761c9","Type":"ContainerStarted","Data":"90812ff53b725d37a66aba3f2c0d1d8a5ef84e7bb2fc6543ba7c5c9f536a3b2d"}
Apr 24 21:27:41.646728 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:41.646705 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hjwlf" event={"ID":"964dbac9-11de-44a8-b2ea-152ca4914413","Type":"ContainerStarted","Data":"351010beea9fba1b2c74a5df6904c5cf2882c84985637ba1179a0894b6de91cb"}
Apr 24 21:27:41.647933 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:41.647906 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" event={"ID":"84da9595-3baf-4bee-854d-b2858b093de3","Type":"ContainerStarted","Data":"71338a9a45ebe80a335a2752761ae5ce17f9200585c0302c001e69b456aa7751"}
Apr 24 21:27:41.649564 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:41.649522 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kplsb" event={"ID":"4b00449f-18b5-4507-83ec-4a003e10f7fb","Type":"ContainerStarted","Data":"8f54489489a0529608455498d02a6428eebcdcad7162ec6ee9903f8ee6cded8c"}
Apr 24 21:27:41.652814 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:41.652779 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-h4z9c" event={"ID":"61331cbf-bfdf-44cd-895b-21d09c03e3a3","Type":"ContainerStarted","Data":"aa068e880d8bec4bbeda969da0c57e8dc8267e65b332d265450212aa35a70ffd"}
Apr 24 21:27:41.654172 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:41.654150 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" event={"ID":"e844355b-199a-4f53-993b-84603868363e","Type":"ContainerStarted","Data":"6fd2dfe6ba60529889335e73170fed455dc83f94fd8765a394e85bba955a2b2a"}
Apr 24 21:27:41.655389 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:41.655368 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ps7q2" event={"ID":"f4586cbe-e12f-4084-9d26-5a60d4858635","Type":"ContainerStarted","Data":"4276f73d964cfc109a3c0b22c0b1995d1309af34f344ab177d6a079f77b98f87"}
Apr 24 21:27:41.663193 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:41.663150 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2ptsg" podStartSLOduration=3.556322067 podStartE2EDuration="20.663138855s" podCreationTimestamp="2026-04-24 21:27:21 +0000 UTC" firstStartedPulling="2026-04-24 21:27:24.105336238 +0000 UTC m=+3.214429700" lastFinishedPulling="2026-04-24 21:27:41.212153012 +0000 UTC m=+20.321246488" observedRunningTime="2026-04-24 21:27:41.662231786 +0000 UTC m=+20.771325270" watchObservedRunningTime="2026-04-24 21:27:41.663138855 +0000 UTC m=+20.772232340"
Apr 24 21:27:41.680380 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:41.680252 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hjwlf" podStartSLOduration=3.618337645 podStartE2EDuration="20.680233123s" podCreationTimestamp="2026-04-24 21:27:21 +0000 UTC" firstStartedPulling="2026-04-24 21:27:24.106408512 +0000 UTC m=+3.215501978" lastFinishedPulling="2026-04-24 21:27:41.168303979 +0000 UTC m=+20.277397456" observedRunningTime="2026-04-24 21:27:41.679609198 +0000 UTC m=+20.788702680" watchObservedRunningTime="2026-04-24 21:27:41.680233123 +0000 UTC m=+20.789326609"
Apr 24 21:27:41.698270 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:41.698190 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kplsb" podStartSLOduration=3.623019777 podStartE2EDuration="20.698170664s" podCreationTimestamp="2026-04-24 21:27:21 +0000 UTC" firstStartedPulling="2026-04-24 21:27:24.097668567 +0000 UTC m=+3.206762029" lastFinishedPulling="2026-04-24 21:27:41.172819437 +0000 UTC m=+20.281912916" observedRunningTime="2026-04-24 21:27:41.696283083 +0000 UTC m=+20.805376572" watchObservedRunningTime="2026-04-24 21:27:41.698170664 +0000 UTC m=+20.807264150"
Apr 24 21:27:41.749055 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:41.748992 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-lnqjv" podStartSLOduration=3.673839074 podStartE2EDuration="20.748975902s" podCreationTimestamp="2026-04-24 21:27:21 +0000 UTC" firstStartedPulling="2026-04-24 21:27:24.100875941 +0000 UTC m=+3.209969403" lastFinishedPulling="2026-04-24 21:27:41.176012765 +0000 UTC m=+20.285106231" observedRunningTime="2026-04-24 21:27:41.748658734 +0000 UTC m=+20.857752325" watchObservedRunningTime="2026-04-24 21:27:41.748975902 +0000 UTC m=+20.858069369"
Apr 24 21:27:41.764895 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:41.764840 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-h4z9c" podStartSLOduration=8.063006462 podStartE2EDuration="20.764820416s" podCreationTimestamp="2026-04-24 21:27:21 +0000 UTC" firstStartedPulling="2026-04-24 21:27:24.092676605 +0000 UTC m=+3.201770069" lastFinishedPulling="2026-04-24 21:27:36.794490558 +0000 UTC m=+15.903584023" observedRunningTime="2026-04-24 21:27:41.764567749 +0000 UTC m=+20.873661233" watchObservedRunningTime="2026-04-24 21:27:41.764820416 +0000 UTC m=+20.873913902"
Apr 24 21:27:41.840682 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:41.840641 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-h4z9c"
Apr 24 21:27:41.841109 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:41.841087 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-h4z9c"
Apr 24 21:27:42.474080 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:42.473915 2581 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 21:27:42.475569 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:42.475481 2581 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:27:42.474075903Z","UUID":"b1b69566-96f6-4841-be01-0858ecf5e548","Handler":null,"Name":"","Endpoint":""}
Apr 24 21:27:42.476952 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:42.476938 2581 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 21:27:42.477069 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:42.476960 2581 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 21:27:42.519359 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:42.519327 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rklj9"
Apr 24 21:27:42.519522 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:42.519427 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rklj9" podUID="151371d6-8756-495b-8181-7fdcb156d1f4"
Apr 24 21:27:42.658552 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:42.658515 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" event={"ID":"e844355b-199a-4f53-993b-84603868363e","Type":"ContainerStarted","Data":"380416a51da912cd425e72ba7127b3a6e191e5574031606abb577db66fbbc860"}
Apr 24 21:27:42.659923 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:42.659894 2581 generic.go:358] "Generic (PLEG): container finished" podID="f4586cbe-e12f-4084-9d26-5a60d4858635" containerID="4276f73d964cfc109a3c0b22c0b1995d1309af34f344ab177d6a079f77b98f87" exitCode=0
Apr 24 21:27:42.660068 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:42.659965 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ps7q2" event={"ID":"f4586cbe-e12f-4084-9d26-5a60d4858635","Type":"ContainerDied","Data":"4276f73d964cfc109a3c0b22c0b1995d1309af34f344ab177d6a079f77b98f87"}
Apr 24 21:27:42.662594 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:42.662580 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 21:27:42.662894 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:42.662869 2581 generic.go:358] "Generic (PLEG): container finished" podID="db3d63f4-067a-47a5-b441-e08cbb119ecd" containerID="f654ab9f79b8ffd5c95c5ff14c15a34b6a5bc22e3b775e3e3fecacdb2f75b858" exitCode=1
Apr 24 21:27:42.662995 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:42.662966 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" event={"ID":"db3d63f4-067a-47a5-b441-e08cbb119ecd","Type":"ContainerDied","Data":"f654ab9f79b8ffd5c95c5ff14c15a34b6a5bc22e3b775e3e3fecacdb2f75b858"}
Apr 24 21:27:42.663072 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:42.663004 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" event={"ID":"db3d63f4-067a-47a5-b441-e08cbb119ecd","Type":"ContainerStarted","Data":"f79f18c6bfe5710865ac783ea27007c098afdc74218c039ccb2907b401d95e6b"}
Apr 24 21:27:42.663072 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:42.663037 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" event={"ID":"db3d63f4-067a-47a5-b441-e08cbb119ecd","Type":"ContainerStarted","Data":"906d3d8cfe0004491c85e810a275dc6900d9eec4a9364f8d6e5fdf7393607dea"}
Apr 24 21:27:42.663072 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:42.663053 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" event={"ID":"db3d63f4-067a-47a5-b441-e08cbb119ecd","Type":"ContainerStarted","Data":"9fc8ad5fcfec58090e20ba545f1b4d48d9ee2d9562cc5a59879e53f8404bcb2d"}
Apr 24 21:27:42.663072 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:42.663068 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" event={"ID":"db3d63f4-067a-47a5-b441-e08cbb119ecd","Type":"ContainerStarted","Data":"5b9031ae139f0a3a17ee512ab6906ff628900f88146b237df91363a4176a2f39"}
Apr 24 21:27:42.664809 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:42.664788 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-h4z9c"
Apr 24 21:27:42.665597 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:42.665572 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-h4z9c"
Apr 24 21:27:42.771297 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:42.771244 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/151371d6-8756-495b-8181-7fdcb156d1f4-original-pull-secret\") pod \"global-pull-secret-syncer-rklj9\" (UID: \"151371d6-8756-495b-8181-7fdcb156d1f4\") " pod="kube-system/global-pull-secret-syncer-rklj9"
Apr 24 21:27:42.771519 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:42.771309 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:42.771519 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:42.771363 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/151371d6-8756-495b-8181-7fdcb156d1f4-original-pull-secret podName:151371d6-8756-495b-8181-7fdcb156d1f4 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:50.771348819 +0000 UTC m=+29.880442287 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/151371d6-8756-495b-8181-7fdcb156d1f4-original-pull-secret") pod "global-pull-secret-syncer-rklj9" (UID: "151371d6-8756-495b-8181-7fdcb156d1f4") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:43.518884 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:43.518849 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2f65f"
Apr 24 21:27:43.519141 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:43.518991 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2f65f" podUID="acca6a48-f7ff-4dec-82df-945011bc308d"
Apr 24 21:27:43.519212 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:43.519134 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-489tz"
Apr 24 21:27:43.519318 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:43.519295 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-489tz" podUID="5dfd7cf1-e10a-410e-b412-be269391a904"
Apr 24 21:27:43.666755 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:43.666722 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-45fvz" event={"ID":"ad14c53e-e5b6-4cbb-9e60-af19eb6027a6","Type":"ContainerStarted","Data":"bcaf22154f9a7e0fcd0dc12c4c45092739980b3543e503abb1a13d517480ffaa"}
Apr 24 21:27:43.682558 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:43.682518 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-45fvz" podStartSLOduration=5.605713668 podStartE2EDuration="22.682504457s" podCreationTimestamp="2026-04-24 21:27:21 +0000 UTC" firstStartedPulling="2026-04-24 21:27:24.101903255 +0000 UTC m=+3.210996732" lastFinishedPulling="2026-04-24 21:27:41.17869404 +0000 UTC m=+20.287787521" observedRunningTime="2026-04-24 21:27:43.681696928 +0000 UTC m=+22.790790421" watchObservedRunningTime="2026-04-24 21:27:43.682504457 +0000 UTC m=+22.791597941"
Apr 24 21:27:44.519150 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:44.519114 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rklj9"
Apr 24 21:27:44.519317 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:44.519239 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rklj9" podUID="151371d6-8756-495b-8181-7fdcb156d1f4"
Apr 24 21:27:44.674568 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:44.674488 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 21:27:44.675014 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:44.674882 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" event={"ID":"db3d63f4-067a-47a5-b441-e08cbb119ecd","Type":"ContainerStarted","Data":"c1f3e63ad9b9a5e9c69379e160044fd88efe1591cc191c3bcfa5ff7aaa3d8d90"}
Apr 24 21:27:44.676957 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:44.676925 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" event={"ID":"e844355b-199a-4f53-993b-84603868363e","Type":"ContainerStarted","Data":"56dcef6d01335806291abc326efaed9c6647dd13235f6f96f66ad0d8a7e88d75"}
Apr 24 21:27:44.695093 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:44.695050 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x5lzw" podStartSLOduration=4.260860471 podStartE2EDuration="23.695014678s" podCreationTimestamp="2026-04-24 21:27:21 +0000 UTC" firstStartedPulling="2026-04-24 21:27:24.093933156 +0000 UTC m=+3.203026619" lastFinishedPulling="2026-04-24 21:27:43.528087363 +0000 UTC m=+22.637180826" observedRunningTime="2026-04-24 21:27:44.694797113 +0000 UTC m=+23.803890597" watchObservedRunningTime="2026-04-24 21:27:44.695014678 +0000 UTC m=+23.804108161"
Apr 24 21:27:45.522111 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:45.522070 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2f65f"
Apr 24 21:27:45.522293 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:45.522071 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-489tz"
Apr 24 21:27:45.522293 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:45.522187 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2f65f" podUID="acca6a48-f7ff-4dec-82df-945011bc308d"
Apr 24 21:27:45.522403 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:45.522291 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-489tz" podUID="5dfd7cf1-e10a-410e-b412-be269391a904"
Apr 24 21:27:46.519326 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:46.519294 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rklj9"
Apr 24 21:27:46.519973 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:46.519415 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rklj9" podUID="151371d6-8756-495b-8181-7fdcb156d1f4"
Apr 24 21:27:47.521561 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:47.521531 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2f65f"
Apr 24 21:27:47.521964 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:47.521539 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-489tz"
Apr 24 21:27:47.521964 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:47.521730 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-489tz" podUID="5dfd7cf1-e10a-410e-b412-be269391a904"
Apr 24 21:27:47.521964 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:47.521618 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2f65f" podUID="acca6a48-f7ff-4dec-82df-945011bc308d"
Apr 24 21:27:47.683916 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:47.683877 2581 generic.go:358] "Generic (PLEG): container finished" podID="f4586cbe-e12f-4084-9d26-5a60d4858635" containerID="3e273b1506dd0b71caa31e3745784726f9cf8369becdbf9d63caaaf30c0e169c" exitCode=0
Apr 24 21:27:47.684118 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:47.683962 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ps7q2" event={"ID":"f4586cbe-e12f-4084-9d26-5a60d4858635","Type":"ContainerDied","Data":"3e273b1506dd0b71caa31e3745784726f9cf8369becdbf9d63caaaf30c0e169c"}
Apr 24 21:27:47.687094 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:47.687076 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 21:27:47.687396 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:47.687374 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" event={"ID":"db3d63f4-067a-47a5-b441-e08cbb119ecd","Type":"ContainerStarted","Data":"338fc9a03000f2aa06ca3ba3764c16cb979b5212cc3f4c33dccd2c82c7d5ee49"}
Apr 24 21:27:47.687681 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:47.687665 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf"
Apr 24 21:27:47.687740 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:47.687689 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf"
Apr 24 21:27:47.687740 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:47.687701 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf"
Apr 24 21:27:47.687917 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:47.687897 2581 scope.go:117] "RemoveContainer" containerID="f654ab9f79b8ffd5c95c5ff14c15a34b6a5bc22e3b775e3e3fecacdb2f75b858"
Apr 24 21:27:47.703183 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:47.703163 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf"
Apr 24 21:27:47.703420 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:47.703407 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf"
Apr 24 21:27:48.519310 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:48.519174 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rklj9"
Apr 24 21:27:48.519514 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:48.519407 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rklj9" podUID="151371d6-8756-495b-8181-7fdcb156d1f4"
Apr 24 21:27:48.618563 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:48.618534 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rklj9"]
Apr 24 21:27:48.620533 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:48.620511 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-489tz"]
Apr 24 21:27:48.620628 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:48.620620 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-489tz"
Apr 24 21:27:48.620737 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:48.620717 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-489tz" podUID="5dfd7cf1-e10a-410e-b412-be269391a904"
Apr 24 21:27:48.630877 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:48.630852 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2f65f"]
Apr 24 21:27:48.630968 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:48.630939 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2f65f"
Apr 24 21:27:48.631042 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:48.631010 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2f65f" podUID="acca6a48-f7ff-4dec-82df-945011bc308d"
Apr 24 21:27:48.692380 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:48.692307 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 21:27:48.692720 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:48.692672 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" event={"ID":"db3d63f4-067a-47a5-b441-e08cbb119ecd","Type":"ContainerStarted","Data":"b6e76f9e4a6c04dfa5a8893634a8eb09c809fb6057f7327f192d09e05e02387e"}
Apr 24 21:27:48.694514 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:48.694487 2581 generic.go:358] "Generic (PLEG): container finished" podID="f4586cbe-e12f-4084-9d26-5a60d4858635" containerID="989d43cdd136538a304c75201d0993223450fede0cb6b17d8f91c5d841ac8cda" exitCode=0
Apr 24 21:27:48.694626 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:48.694558 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rklj9"
Apr 24 21:27:48.694626 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:48.694568 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ps7q2" event={"ID":"f4586cbe-e12f-4084-9d26-5a60d4858635","Type":"ContainerDied","Data":"989d43cdd136538a304c75201d0993223450fede0cb6b17d8f91c5d841ac8cda"}
Apr 24 21:27:48.694705 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:48.694663 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-rklj9" podUID="151371d6-8756-495b-8181-7fdcb156d1f4" Apr 24 21:27:48.756351 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:48.756301 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf" podStartSLOduration=10.439518834 podStartE2EDuration="27.7562866s" podCreationTimestamp="2026-04-24 21:27:21 +0000 UTC" firstStartedPulling="2026-04-24 21:27:24.106124389 +0000 UTC m=+3.215217852" lastFinishedPulling="2026-04-24 21:27:41.422892146 +0000 UTC m=+20.531985618" observedRunningTime="2026-04-24 21:27:48.724757521 +0000 UTC m=+27.833851016" watchObservedRunningTime="2026-04-24 21:27:48.7562866 +0000 UTC m=+27.865380120" Apr 24 21:27:49.698365 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:49.698331 2581 generic.go:358] "Generic (PLEG): container finished" podID="f4586cbe-e12f-4084-9d26-5a60d4858635" containerID="feb50685e19574f22642bedfc04b566e178eed801450bd80c94f02445b34fa9a" exitCode=0 Apr 24 21:27:49.698818 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:49.698427 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ps7q2" event={"ID":"f4586cbe-e12f-4084-9d26-5a60d4858635","Type":"ContainerDied","Data":"feb50685e19574f22642bedfc04b566e178eed801450bd80c94f02445b34fa9a"} Apr 24 21:27:50.519220 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:50.519184 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rklj9" Apr 24 21:27:50.519220 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:50.519203 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2f65f" Apr 24 21:27:50.519451 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:50.519307 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-489tz" Apr 24 21:27:50.519451 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:50.519329 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rklj9" podUID="151371d6-8756-495b-8181-7fdcb156d1f4" Apr 24 21:27:50.519451 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:50.519371 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2f65f" podUID="acca6a48-f7ff-4dec-82df-945011bc308d" Apr 24 21:27:50.519597 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:50.519457 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-489tz" podUID="5dfd7cf1-e10a-410e-b412-be269391a904" Apr 24 21:27:50.828078 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:50.828038 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/151371d6-8756-495b-8181-7fdcb156d1f4-original-pull-secret\") pod \"global-pull-secret-syncer-rklj9\" (UID: \"151371d6-8756-495b-8181-7fdcb156d1f4\") " pod="kube-system/global-pull-secret-syncer-rklj9" Apr 24 21:27:50.828525 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:50.828140 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:50.828525 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:50.828223 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/151371d6-8756-495b-8181-7fdcb156d1f4-original-pull-secret podName:151371d6-8756-495b-8181-7fdcb156d1f4 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:06.82820115 +0000 UTC m=+45.937294618 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/151371d6-8756-495b-8181-7fdcb156d1f4-original-pull-secret") pod "global-pull-secret-syncer-rklj9" (UID: "151371d6-8756-495b-8181-7fdcb156d1f4") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:52.518807 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:52.518773 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2f65f" Apr 24 21:27:52.519426 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:52.518897 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2f65f" podUID="acca6a48-f7ff-4dec-82df-945011bc308d" Apr 24 21:27:52.519426 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:52.518905 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rklj9" Apr 24 21:27:52.519426 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:52.518924 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-489tz" Apr 24 21:27:52.519426 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:52.518984 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rklj9" podUID="151371d6-8756-495b-8181-7fdcb156d1f4" Apr 24 21:27:52.519426 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:52.519089 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-489tz" podUID="5dfd7cf1-e10a-410e-b412-be269391a904" Apr 24 21:27:54.201117 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.201043 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-31.ec2.internal" event="NodeReady" Apr 24 21:27:54.201567 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.201197 2581 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 21:27:54.252309 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.251574 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-579b557bd8-ns9mf"] Apr 24 21:27:54.256300 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.256280 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" Apr 24 21:27:54.258766 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.258742 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 21:27:54.258872 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.258840 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-bckfb\"" Apr 24 21:27:54.259718 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.259548 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 21:27:54.259718 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.259643 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 21:27:54.265899 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.265874 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 21:27:54.272719 
ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.272694 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zdbnj"] Apr 24 21:27:54.276671 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.276646 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-sdjvb"] Apr 24 21:27:54.276844 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.276822 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zdbnj" Apr 24 21:27:54.280187 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.279924 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:27:54.280187 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.279981 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:27:54.280187 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.280050 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:27:54.280187 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.280101 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-gftjb\"" Apr 24 21:27:54.283075 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.283056 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-579b557bd8-ns9mf"] Apr 24 21:27:54.283183 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.283172 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-sdjvb" Apr 24 21:27:54.285694 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.285677 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:27:54.285794 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.285774 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:27:54.285978 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.285726 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wtdzs\"" Apr 24 21:27:54.286136 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.286116 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zdbnj"] Apr 24 21:27:54.304045 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.304004 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sdjvb"] Apr 24 21:27:54.355781 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.355744 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/57bed6cb-14ec-45db-98ca-49d0e2a82730-config-volume\") pod \"dns-default-sdjvb\" (UID: \"57bed6cb-14ec-45db-98ca-49d0e2a82730\") " pod="openshift-dns/dns-default-sdjvb" Apr 24 21:27:54.355781 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.355776 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnq2h\" (UniqueName: \"kubernetes.io/projected/57bed6cb-14ec-45db-98ca-49d0e2a82730-kube-api-access-tnq2h\") pod \"dns-default-sdjvb\" (UID: \"57bed6cb-14ec-45db-98ca-49d0e2a82730\") " pod="openshift-dns/dns-default-sdjvb" Apr 24 21:27:54.355995 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.355791 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78d3eb37-5559-42a5-b81b-c2219787cc5b-cert\") pod \"ingress-canary-zdbnj\" (UID: \"78d3eb37-5559-42a5-b81b-c2219787cc5b\") " pod="openshift-ingress-canary/ingress-canary-zdbnj" Apr 24 21:27:54.355995 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.355882 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2e08127c-5e30-4001-8d57-fe051bc907b1-ca-trust-extracted\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" Apr 24 21:27:54.355995 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.355920 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-certificates\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" Apr 24 21:27:54.355995 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.355950 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-bound-sa-token\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" Apr 24 21:27:54.356192 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.355999 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e08127c-5e30-4001-8d57-fe051bc907b1-trusted-ca\") pod \"image-registry-579b557bd8-ns9mf\" (UID: 
\"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" Apr 24 21:27:54.356192 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.356052 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4lwc\" (UniqueName: \"kubernetes.io/projected/78d3eb37-5559-42a5-b81b-c2219787cc5b-kube-api-access-p4lwc\") pod \"ingress-canary-zdbnj\" (UID: \"78d3eb37-5559-42a5-b81b-c2219787cc5b\") " pod="openshift-ingress-canary/ingress-canary-zdbnj" Apr 24 21:27:54.356192 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.356097 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57bed6cb-14ec-45db-98ca-49d0e2a82730-tmp-dir\") pod \"dns-default-sdjvb\" (UID: \"57bed6cb-14ec-45db-98ca-49d0e2a82730\") " pod="openshift-dns/dns-default-sdjvb" Apr 24 21:27:54.356192 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.356149 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2e08127c-5e30-4001-8d57-fe051bc907b1-image-registry-private-configuration\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" Apr 24 21:27:54.356192 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.356184 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdhd5\" (UniqueName: \"kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-kube-api-access-xdhd5\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" Apr 24 21:27:54.356380 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.356228 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2e08127c-5e30-4001-8d57-fe051bc907b1-installation-pull-secrets\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" Apr 24 21:27:54.356380 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.356255 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57bed6cb-14ec-45db-98ca-49d0e2a82730-metrics-tls\") pod \"dns-default-sdjvb\" (UID: \"57bed6cb-14ec-45db-98ca-49d0e2a82730\") " pod="openshift-dns/dns-default-sdjvb" Apr 24 21:27:54.356380 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.356280 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-tls\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" Apr 24 21:27:54.457177 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.457082 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2e08127c-5e30-4001-8d57-fe051bc907b1-installation-pull-secrets\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" Apr 24 21:27:54.457177 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.457137 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57bed6cb-14ec-45db-98ca-49d0e2a82730-metrics-tls\") pod \"dns-default-sdjvb\" (UID: 
\"57bed6cb-14ec-45db-98ca-49d0e2a82730\") " pod="openshift-dns/dns-default-sdjvb" Apr 24 21:27:54.457177 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.457161 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-tls\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" Apr 24 21:27:54.457435 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.457188 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/57bed6cb-14ec-45db-98ca-49d0e2a82730-config-volume\") pod \"dns-default-sdjvb\" (UID: \"57bed6cb-14ec-45db-98ca-49d0e2a82730\") " pod="openshift-dns/dns-default-sdjvb" Apr 24 21:27:54.457435 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.457208 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnq2h\" (UniqueName: \"kubernetes.io/projected/57bed6cb-14ec-45db-98ca-49d0e2a82730-kube-api-access-tnq2h\") pod \"dns-default-sdjvb\" (UID: \"57bed6cb-14ec-45db-98ca-49d0e2a82730\") " pod="openshift-dns/dns-default-sdjvb" Apr 24 21:27:54.457435 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.457232 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78d3eb37-5559-42a5-b81b-c2219787cc5b-cert\") pod \"ingress-canary-zdbnj\" (UID: \"78d3eb37-5559-42a5-b81b-c2219787cc5b\") " pod="openshift-ingress-canary/ingress-canary-zdbnj" Apr 24 21:27:54.457435 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.457274 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2e08127c-5e30-4001-8d57-fe051bc907b1-ca-trust-extracted\") pod 
\"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" Apr 24 21:27:54.457435 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.457297 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-certificates\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" Apr 24 21:27:54.457435 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:54.457319 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:27:54.457435 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:54.457386 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:27:54.457753 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.457319 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-bound-sa-token\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" Apr 24 21:27:54.457753 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:54.457318 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:27:54.457753 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:54.457540 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-579b557bd8-ns9mf: secret "image-registry-tls" not found Apr 24 21:27:54.457753 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:54.457394 2581 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57bed6cb-14ec-45db-98ca-49d0e2a82730-metrics-tls podName:57bed6cb-14ec-45db-98ca-49d0e2a82730 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:54.957374176 +0000 UTC m=+34.066467652 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/57bed6cb-14ec-45db-98ca-49d0e2a82730-metrics-tls") pod "dns-default-sdjvb" (UID: "57bed6cb-14ec-45db-98ca-49d0e2a82730") : secret "dns-default-metrics-tls" not found Apr 24 21:27:54.457753 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:54.457600 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3eb37-5559-42a5-b81b-c2219787cc5b-cert podName:78d3eb37-5559-42a5-b81b-c2219787cc5b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:54.957587956 +0000 UTC m=+34.066681430 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78d3eb37-5559-42a5-b81b-c2219787cc5b-cert") pod "ingress-canary-zdbnj" (UID: "78d3eb37-5559-42a5-b81b-c2219787cc5b") : secret "canary-serving-cert" not found Apr 24 21:27:54.457753 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:54.457619 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-tls podName:2e08127c-5e30-4001-8d57-fe051bc907b1 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:54.957609761 +0000 UTC m=+34.066703237 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-tls") pod "image-registry-579b557bd8-ns9mf" (UID: "2e08127c-5e30-4001-8d57-fe051bc907b1") : secret "image-registry-tls" not found Apr 24 21:27:54.458100 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.457677 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e08127c-5e30-4001-8d57-fe051bc907b1-trusted-ca\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" Apr 24 21:27:54.458100 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.457928 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4lwc\" (UniqueName: \"kubernetes.io/projected/78d3eb37-5559-42a5-b81b-c2219787cc5b-kube-api-access-p4lwc\") pod \"ingress-canary-zdbnj\" (UID: \"78d3eb37-5559-42a5-b81b-c2219787cc5b\") " pod="openshift-ingress-canary/ingress-canary-zdbnj" Apr 24 21:27:54.458100 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.457807 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2e08127c-5e30-4001-8d57-fe051bc907b1-ca-trust-extracted\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" Apr 24 21:27:54.458100 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.457972 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/57bed6cb-14ec-45db-98ca-49d0e2a82730-config-volume\") pod \"dns-default-sdjvb\" (UID: \"57bed6cb-14ec-45db-98ca-49d0e2a82730\") " pod="openshift-dns/dns-default-sdjvb" Apr 24 21:27:54.458100 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.457978 
2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57bed6cb-14ec-45db-98ca-49d0e2a82730-tmp-dir\") pod \"dns-default-sdjvb\" (UID: \"57bed6cb-14ec-45db-98ca-49d0e2a82730\") " pod="openshift-dns/dns-default-sdjvb"
Apr 24 21:27:54.458100 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.458088 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2e08127c-5e30-4001-8d57-fe051bc907b1-image-registry-private-configuration\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf"
Apr 24 21:27:54.458394 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.458129 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdhd5\" (UniqueName: \"kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-kube-api-access-xdhd5\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf"
Apr 24 21:27:54.458394 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.458242 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-certificates\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf"
Apr 24 21:27:54.458394 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.458243 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/57bed6cb-14ec-45db-98ca-49d0e2a82730-tmp-dir\") pod \"dns-default-sdjvb\" (UID: \"57bed6cb-14ec-45db-98ca-49d0e2a82730\") " pod="openshift-dns/dns-default-sdjvb"
Apr 24 21:27:54.458690 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.458667 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e08127c-5e30-4001-8d57-fe051bc907b1-trusted-ca\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf"
Apr 24 21:27:54.462313 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.462290 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2e08127c-5e30-4001-8d57-fe051bc907b1-image-registry-private-configuration\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf"
Apr 24 21:27:54.462406 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.462339 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2e08127c-5e30-4001-8d57-fe051bc907b1-installation-pull-secrets\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf"
Apr 24 21:27:54.465914 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.465896 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnq2h\" (UniqueName: \"kubernetes.io/projected/57bed6cb-14ec-45db-98ca-49d0e2a82730-kube-api-access-tnq2h\") pod \"dns-default-sdjvb\" (UID: \"57bed6cb-14ec-45db-98ca-49d0e2a82730\") " pod="openshift-dns/dns-default-sdjvb"
Apr 24 21:27:54.466003 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.465962 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-bound-sa-token\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf"
Apr 24 21:27:54.471135 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.471094 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdhd5\" (UniqueName: \"kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-kube-api-access-xdhd5\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf"
Apr 24 21:27:54.471465 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.471447 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4lwc\" (UniqueName: \"kubernetes.io/projected/78d3eb37-5559-42a5-b81b-c2219787cc5b-kube-api-access-p4lwc\") pod \"ingress-canary-zdbnj\" (UID: \"78d3eb37-5559-42a5-b81b-c2219787cc5b\") " pod="openshift-ingress-canary/ingress-canary-zdbnj"
Apr 24 21:27:54.518693 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.518657 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rklj9"
Apr 24 21:27:54.518842 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.518713 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2f65f"
Apr 24 21:27:54.518842 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.518666 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-489tz"
Apr 24 21:27:54.521324 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.521289 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 21:27:54.521324 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.521317 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kgq96\""
Apr 24 21:27:54.521480 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.521443 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 21:27:54.521568 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.521545 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 21:27:54.521669 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.521649 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 21:27:54.521768 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.521752 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cbm9w\""
Apr 24 21:27:54.961812 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.961775 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57bed6cb-14ec-45db-98ca-49d0e2a82730-metrics-tls\") pod \"dns-default-sdjvb\" (UID: \"57bed6cb-14ec-45db-98ca-49d0e2a82730\") " pod="openshift-dns/dns-default-sdjvb"
Apr 24 21:27:54.961812 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.961814 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-tls\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf"
Apr 24 21:27:54.962054 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:54.961844 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78d3eb37-5559-42a5-b81b-c2219787cc5b-cert\") pod \"ingress-canary-zdbnj\" (UID: \"78d3eb37-5559-42a5-b81b-c2219787cc5b\") " pod="openshift-ingress-canary/ingress-canary-zdbnj"
Apr 24 21:27:54.962054 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:54.961931 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:54.962054 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:54.961945 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:27:54.962054 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:54.961965 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-579b557bd8-ns9mf: secret "image-registry-tls" not found
Apr 24 21:27:54.962054 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:54.961965 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:54.962054 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:54.962008 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57bed6cb-14ec-45db-98ca-49d0e2a82730-metrics-tls podName:57bed6cb-14ec-45db-98ca-49d0e2a82730 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:55.961990289 +0000 UTC m=+35.071083770 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/57bed6cb-14ec-45db-98ca-49d0e2a82730-metrics-tls") pod "dns-default-sdjvb" (UID: "57bed6cb-14ec-45db-98ca-49d0e2a82730") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:54.962054 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:54.962037 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3eb37-5559-42a5-b81b-c2219787cc5b-cert podName:78d3eb37-5559-42a5-b81b-c2219787cc5b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:55.962030982 +0000 UTC m=+35.071124445 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78d3eb37-5559-42a5-b81b-c2219787cc5b-cert") pod "ingress-canary-zdbnj" (UID: "78d3eb37-5559-42a5-b81b-c2219787cc5b") : secret "canary-serving-cert" not found
Apr 24 21:27:54.962054 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:54.962051 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-tls podName:2e08127c-5e30-4001-8d57-fe051bc907b1 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:55.962042946 +0000 UTC m=+35.071136409 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-tls") pod "image-registry-579b557bd8-ns9mf" (UID: "2e08127c-5e30-4001-8d57-fe051bc907b1") : secret "image-registry-tls" not found
Apr 24 21:27:55.264763 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.264686 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs\") pod \"network-metrics-daemon-489tz\" (UID: \"5dfd7cf1-e10a-410e-b412-be269391a904\") " pod="openshift-multus/network-metrics-daemon-489tz"
Apr 24 21:27:55.265168 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:55.264815 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 21:27:55.265168 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:55.264891 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs podName:5dfd7cf1-e10a-410e-b412-be269391a904 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:27.264874123 +0000 UTC m=+66.373967586 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs") pod "network-metrics-daemon-489tz" (UID: "5dfd7cf1-e10a-410e-b412-be269391a904") : secret "metrics-daemon-secret" not found
Apr 24 21:27:55.365626 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.365592 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5qkm\" (UniqueName: \"kubernetes.io/projected/acca6a48-f7ff-4dec-82df-945011bc308d-kube-api-access-k5qkm\") pod \"network-check-target-2f65f\" (UID: \"acca6a48-f7ff-4dec-82df-945011bc308d\") " pod="openshift-network-diagnostics/network-check-target-2f65f"
Apr 24 21:27:55.368275 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.368248 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5qkm\" (UniqueName: \"kubernetes.io/projected/acca6a48-f7ff-4dec-82df-945011bc308d-kube-api-access-k5qkm\") pod \"network-check-target-2f65f\" (UID: \"acca6a48-f7ff-4dec-82df-945011bc308d\") " pod="openshift-network-diagnostics/network-check-target-2f65f"
Apr 24 21:27:55.443570 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.443538 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2f65f"
Apr 24 21:27:55.640302 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.640272 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-qjc8k"]
Apr 24 21:27:55.652450 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.652425 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qjc8k"
Apr 24 21:27:55.652581 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.652540 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-qjc8k"]
Apr 24 21:27:55.654869 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.654839 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 24 21:27:55.654869 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.654863 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 24 21:27:55.655672 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.655654 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-wqhv4\""
Apr 24 21:27:55.666621 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.666596 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2f65f"]
Apr 24 21:27:55.671037 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:55.670990 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacca6a48_f7ff_4dec_82df_945011bc308d.slice/crio-5de10d4cb0ba028b4807c365a4e4ab307835f9963e0735c34ecc2f539496c294 WatchSource:0}: Error finding container 5de10d4cb0ba028b4807c365a4e4ab307835f9963e0735c34ecc2f539496c294: Status 404 returned error can't find the container with id 5de10d4cb0ba028b4807c365a4e4ab307835f9963e0735c34ecc2f539496c294
Apr 24 21:27:55.712179 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.712137 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ps7q2" event={"ID":"f4586cbe-e12f-4084-9d26-5a60d4858635","Type":"ContainerStarted","Data":"bad10d6235bdd1cc949f04516e4cb4e93f2566233f53c5677f15c6c82596856e"}
Apr 24 21:27:55.713188 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.713157 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2f65f" event={"ID":"acca6a48-f7ff-4dec-82df-945011bc308d","Type":"ContainerStarted","Data":"5de10d4cb0ba028b4807c365a4e4ab307835f9963e0735c34ecc2f539496c294"}
Apr 24 21:27:55.761622 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.761587 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2pb8s"]
Apr 24 21:27:55.768977 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.768905 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h4l5\" (UniqueName: \"kubernetes.io/projected/67b04c4e-0208-4fe3-b15b-96d48c530953-kube-api-access-6h4l5\") pod \"migrator-74bb7799d9-qjc8k\" (UID: \"67b04c4e-0208-4fe3-b15b-96d48c530953\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qjc8k"
Apr 24 21:27:55.776720 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.776695 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2pb8s"]
Apr 24 21:27:55.776851 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.776832 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-2pb8s"
Apr 24 21:27:55.780776 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.780752 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 24 21:27:55.780776 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.780759 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 24 21:27:55.780961 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.780762 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 24 21:27:55.780961 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.780818 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 24 21:27:55.781211 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.781184 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-gdjdx\""
Apr 24 21:27:55.870271 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.870238 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6h4l5\" (UniqueName: \"kubernetes.io/projected/67b04c4e-0208-4fe3-b15b-96d48c530953-kube-api-access-6h4l5\") pod \"migrator-74bb7799d9-qjc8k\" (UID: \"67b04c4e-0208-4fe3-b15b-96d48c530953\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qjc8k"
Apr 24 21:27:55.870271 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.870276 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e3d4ccc3-bec5-4593-b3a1-deedb5979450-signing-cabundle\") pod \"service-ca-865cb79987-2pb8s\" (UID: \"e3d4ccc3-bec5-4593-b3a1-deedb5979450\") " pod="openshift-service-ca/service-ca-865cb79987-2pb8s"
Apr 24 21:27:55.870510 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.870307 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss2kk\" (UniqueName: \"kubernetes.io/projected/e3d4ccc3-bec5-4593-b3a1-deedb5979450-kube-api-access-ss2kk\") pod \"service-ca-865cb79987-2pb8s\" (UID: \"e3d4ccc3-bec5-4593-b3a1-deedb5979450\") " pod="openshift-service-ca/service-ca-865cb79987-2pb8s"
Apr 24 21:27:55.870510 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.870343 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e3d4ccc3-bec5-4593-b3a1-deedb5979450-signing-key\") pod \"service-ca-865cb79987-2pb8s\" (UID: \"e3d4ccc3-bec5-4593-b3a1-deedb5979450\") " pod="openshift-service-ca/service-ca-865cb79987-2pb8s"
Apr 24 21:27:55.882254 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.882112 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h4l5\" (UniqueName: \"kubernetes.io/projected/67b04c4e-0208-4fe3-b15b-96d48c530953-kube-api-access-6h4l5\") pod \"migrator-74bb7799d9-qjc8k\" (UID: \"67b04c4e-0208-4fe3-b15b-96d48c530953\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qjc8k"
Apr 24 21:27:55.961975 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.961946 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qjc8k"
Apr 24 21:27:55.970846 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.970825 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e3d4ccc3-bec5-4593-b3a1-deedb5979450-signing-cabundle\") pod \"service-ca-865cb79987-2pb8s\" (UID: \"e3d4ccc3-bec5-4593-b3a1-deedb5979450\") " pod="openshift-service-ca/service-ca-865cb79987-2pb8s"
Apr 24 21:27:55.970907 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.970882 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ss2kk\" (UniqueName: \"kubernetes.io/projected/e3d4ccc3-bec5-4593-b3a1-deedb5979450-kube-api-access-ss2kk\") pod \"service-ca-865cb79987-2pb8s\" (UID: \"e3d4ccc3-bec5-4593-b3a1-deedb5979450\") " pod="openshift-service-ca/service-ca-865cb79987-2pb8s"
Apr 24 21:27:55.970938 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.970909 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e3d4ccc3-bec5-4593-b3a1-deedb5979450-signing-key\") pod \"service-ca-865cb79987-2pb8s\" (UID: \"e3d4ccc3-bec5-4593-b3a1-deedb5979450\") " pod="openshift-service-ca/service-ca-865cb79987-2pb8s"
Apr 24 21:27:55.970974 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.970943 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57bed6cb-14ec-45db-98ca-49d0e2a82730-metrics-tls\") pod \"dns-default-sdjvb\" (UID: \"57bed6cb-14ec-45db-98ca-49d0e2a82730\") " pod="openshift-dns/dns-default-sdjvb"
Apr 24 21:27:55.970974 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.970965 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-tls\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf"
Apr 24 21:27:55.971087 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.970995 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78d3eb37-5559-42a5-b81b-c2219787cc5b-cert\") pod \"ingress-canary-zdbnj\" (UID: \"78d3eb37-5559-42a5-b81b-c2219787cc5b\") " pod="openshift-ingress-canary/ingress-canary-zdbnj"
Apr 24 21:27:55.971142 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:55.971085 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:27:55.971142 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:55.971100 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:55.971142 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:55.971105 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-579b557bd8-ns9mf: secret "image-registry-tls" not found
Apr 24 21:27:55.971267 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:55.971102 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:55.971267 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:55.971167 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-tls podName:2e08127c-5e30-4001-8d57-fe051bc907b1 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:57.971145333 +0000 UTC m=+37.080238810 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-tls") pod "image-registry-579b557bd8-ns9mf" (UID: "2e08127c-5e30-4001-8d57-fe051bc907b1") : secret "image-registry-tls" not found
Apr 24 21:27:55.971267 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:55.971184 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57bed6cb-14ec-45db-98ca-49d0e2a82730-metrics-tls podName:57bed6cb-14ec-45db-98ca-49d0e2a82730 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:57.971176246 +0000 UTC m=+37.080269708 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/57bed6cb-14ec-45db-98ca-49d0e2a82730-metrics-tls") pod "dns-default-sdjvb" (UID: "57bed6cb-14ec-45db-98ca-49d0e2a82730") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:55.971267 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:55.971198 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3eb37-5559-42a5-b81b-c2219787cc5b-cert podName:78d3eb37-5559-42a5-b81b-c2219787cc5b nodeName:}" failed. No retries permitted until 2026-04-24 21:27:57.97119006 +0000 UTC m=+37.080283525 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78d3eb37-5559-42a5-b81b-c2219787cc5b-cert") pod "ingress-canary-zdbnj" (UID: "78d3eb37-5559-42a5-b81b-c2219787cc5b") : secret "canary-serving-cert" not found
Apr 24 21:27:55.971635 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.971615 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e3d4ccc3-bec5-4593-b3a1-deedb5979450-signing-cabundle\") pod \"service-ca-865cb79987-2pb8s\" (UID: \"e3d4ccc3-bec5-4593-b3a1-deedb5979450\") " pod="openshift-service-ca/service-ca-865cb79987-2pb8s"
Apr 24 21:27:55.973535 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.973512 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e3d4ccc3-bec5-4593-b3a1-deedb5979450-signing-key\") pod \"service-ca-865cb79987-2pb8s\" (UID: \"e3d4ccc3-bec5-4593-b3a1-deedb5979450\") " pod="openshift-service-ca/service-ca-865cb79987-2pb8s"
Apr 24 21:27:55.983281 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:55.983254 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss2kk\" (UniqueName: \"kubernetes.io/projected/e3d4ccc3-bec5-4593-b3a1-deedb5979450-kube-api-access-ss2kk\") pod \"service-ca-865cb79987-2pb8s\" (UID: \"e3d4ccc3-bec5-4593-b3a1-deedb5979450\") " pod="openshift-service-ca/service-ca-865cb79987-2pb8s"
Apr 24 21:28:56.085014 is not present in source; (see next line)
Apr 24 21:27:56.085014 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:56.084967 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-qjc8k"]
Apr 24 21:27:56.086805 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:56.086785 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-2pb8s"
Apr 24 21:27:56.089241 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:56.089213 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67b04c4e_0208_4fe3_b15b_96d48c530953.slice/crio-43fb50f9888b225522e8675ec0ff87110c81f866e95e6e0c906df01bdd2211a8 WatchSource:0}: Error finding container 43fb50f9888b225522e8675ec0ff87110c81f866e95e6e0c906df01bdd2211a8: Status 404 returned error can't find the container with id 43fb50f9888b225522e8675ec0ff87110c81f866e95e6e0c906df01bdd2211a8
Apr 24 21:27:56.241802 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:56.241757 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2pb8s"]
Apr 24 21:27:56.245946 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:27:56.245915 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3d4ccc3_bec5_4593_b3a1_deedb5979450.slice/crio-95ffbcca4ab07b5492047dd1c599be18f030aa0e8ab51e57549a0c3bbe28756f WatchSource:0}: Error finding container 95ffbcca4ab07b5492047dd1c599be18f030aa0e8ab51e57549a0c3bbe28756f: Status 404 returned error can't find the container with id 95ffbcca4ab07b5492047dd1c599be18f030aa0e8ab51e57549a0c3bbe28756f
Apr 24 21:27:56.523013 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:56.522944 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hjwlf_964dbac9-11de-44a8-b2ea-152ca4914413/dns-node-resolver/0.log"
Apr 24 21:27:56.717080 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:56.717044 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-2pb8s" event={"ID":"e3d4ccc3-bec5-4593-b3a1-deedb5979450","Type":"ContainerStarted","Data":"95ffbcca4ab07b5492047dd1c599be18f030aa0e8ab51e57549a0c3bbe28756f"}
Apr 24 21:27:56.720707 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:56.720578 2581 generic.go:358] "Generic (PLEG): container finished" podID="f4586cbe-e12f-4084-9d26-5a60d4858635" containerID="bad10d6235bdd1cc949f04516e4cb4e93f2566233f53c5677f15c6c82596856e" exitCode=0
Apr 24 21:27:56.720707 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:56.720667 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ps7q2" event={"ID":"f4586cbe-e12f-4084-9d26-5a60d4858635","Type":"ContainerDied","Data":"bad10d6235bdd1cc949f04516e4cb4e93f2566233f53c5677f15c6c82596856e"}
Apr 24 21:27:56.722893 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:56.722856 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qjc8k" event={"ID":"67b04c4e-0208-4fe3-b15b-96d48c530953","Type":"ContainerStarted","Data":"43fb50f9888b225522e8675ec0ff87110c81f866e95e6e0c906df01bdd2211a8"}
Apr 24 21:27:57.727846 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:57.727808 2581 generic.go:358] "Generic (PLEG): container finished" podID="f4586cbe-e12f-4084-9d26-5a60d4858635" containerID="40d06d5d3b257e2365fb3340d1da9281a5e3742376c87769da2b718a6aaae022" exitCode=0
Apr 24 21:27:57.728507 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:57.727891 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ps7q2" event={"ID":"f4586cbe-e12f-4084-9d26-5a60d4858635","Type":"ContainerDied","Data":"40d06d5d3b257e2365fb3340d1da9281a5e3742376c87769da2b718a6aaae022"}
Apr 24 21:27:57.729375 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:57.729330 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qjc8k" event={"ID":"67b04c4e-0208-4fe3-b15b-96d48c530953","Type":"ContainerStarted","Data":"38df1fb81dc8314d36b7bf18c64b43f47d21d800b1046a9d69d8db775a0d30fb"}
Apr 24 21:27:57.732663 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:57.732641 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kplsb_4b00449f-18b5-4507-83ec-4a003e10f7fb/node-ca/0.log"
Apr 24 21:27:57.990798 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:57.990709 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57bed6cb-14ec-45db-98ca-49d0e2a82730-metrics-tls\") pod \"dns-default-sdjvb\" (UID: \"57bed6cb-14ec-45db-98ca-49d0e2a82730\") " pod="openshift-dns/dns-default-sdjvb"
Apr 24 21:27:57.990798 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:57.990755 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-tls\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf"
Apr 24 21:27:57.990798 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:27:57.990784 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78d3eb37-5559-42a5-b81b-c2219787cc5b-cert\") pod \"ingress-canary-zdbnj\" (UID: \"78d3eb37-5559-42a5-b81b-c2219787cc5b\") " pod="openshift-ingress-canary/ingress-canary-zdbnj"
Apr 24 21:27:57.991091 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:57.990867 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:27:57.991091 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:57.990878 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:27:57.991091 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:57.990908 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-579b557bd8-ns9mf: secret "image-registry-tls" not found
Apr 24 21:27:57.991091 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:57.990879 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:27:57.991091 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:57.990944 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57bed6cb-14ec-45db-98ca-49d0e2a82730-metrics-tls podName:57bed6cb-14ec-45db-98ca-49d0e2a82730 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:01.990919359 +0000 UTC m=+41.100012823 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/57bed6cb-14ec-45db-98ca-49d0e2a82730-metrics-tls") pod "dns-default-sdjvb" (UID: "57bed6cb-14ec-45db-98ca-49d0e2a82730") : secret "dns-default-metrics-tls" not found
Apr 24 21:27:57.991091 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:57.990987 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-tls podName:2e08127c-5e30-4001-8d57-fe051bc907b1 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:01.990969557 +0000 UTC m=+41.100063019 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-tls") pod "image-registry-579b557bd8-ns9mf" (UID: "2e08127c-5e30-4001-8d57-fe051bc907b1") : secret "image-registry-tls" not found
Apr 24 21:27:57.991091 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:27:57.991009 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3eb37-5559-42a5-b81b-c2219787cc5b-cert podName:78d3eb37-5559-42a5-b81b-c2219787cc5b nodeName:}" failed. No retries permitted until 2026-04-24 21:28:01.990999142 +0000 UTC m=+41.100092616 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78d3eb37-5559-42a5-b81b-c2219787cc5b-cert") pod "ingress-canary-zdbnj" (UID: "78d3eb37-5559-42a5-b81b-c2219787cc5b") : secret "canary-serving-cert" not found
Apr 24 21:28:00.737266 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:00.736990 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2f65f" event={"ID":"acca6a48-f7ff-4dec-82df-945011bc308d","Type":"ContainerStarted","Data":"b9bd682880e2b469c51dac006d49599c0e2ffa790d08cf801b2e1fb2a49f4b8e"}
Apr 24 21:28:00.737266 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:00.737239 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-2f65f"
Apr 24 21:28:00.738631 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:00.738600 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qjc8k" event={"ID":"67b04c4e-0208-4fe3-b15b-96d48c530953","Type":"ContainerStarted","Data":"73c4997409594fbb2af6b71103e6a8b3fbbb45a5ae8796b4c2815dd30c842ad1"}
Apr 24 21:28:00.739864 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:00.739844 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-2pb8s" event={"ID":"e3d4ccc3-bec5-4593-b3a1-deedb5979450","Type":"ContainerStarted","Data":"36345f5c4a219adaba5789f005fff999c31a3ce58ebc91ea0111466c4a85b342"}
Apr 24 21:28:00.746355 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:00.746332 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-ps7q2" event={"ID":"f4586cbe-e12f-4084-9d26-5a60d4858635","Type":"ContainerStarted","Data":"e3f139fbfd91212bb022282a36fd22d92f90575ab879213292a6c9f717fcd6ca"}
Apr 24 21:28:00.755230 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:00.755103 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-2f65f" podStartSLOduration=35.758659817 podStartE2EDuration="39.755090392s" podCreationTimestamp="2026-04-24 21:27:21 +0000 UTC" firstStartedPulling="2026-04-24 21:27:55.673385027 +0000 UTC m=+34.782478491" lastFinishedPulling="2026-04-24 21:27:59.669815596 +0000 UTC m=+38.778909066" observedRunningTime="2026-04-24 21:28:00.754580925 +0000 UTC m=+39.863674410" watchObservedRunningTime="2026-04-24 21:28:00.755090392 +0000 UTC m=+39.864183875"
Apr 24 21:28:00.782144 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:00.782094 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-ps7q2" podStartSLOduration=8.36207775 podStartE2EDuration="39.782077076s" podCreationTimestamp="2026-04-24 21:27:21 +0000 UTC" firstStartedPulling="2026-04-24 21:27:24.095157306 +0000 UTC m=+3.204250769" lastFinishedPulling="2026-04-24 21:27:55.515156629 +0000 UTC m=+34.624250095" observedRunningTime="2026-04-24 21:28:00.780262967 +0000 UTC m=+39.889356465" watchObservedRunningTime="2026-04-24 21:28:00.782077076 +0000 UTC m=+39.891170562"
Apr 24 21:28:00.799827 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:00.799773 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-qjc8k" podStartSLOduration=4.407198053 podStartE2EDuration="5.799759209s" podCreationTimestamp="2026-04-24 21:27:55 +0000 UTC" firstStartedPulling="2026-04-24 21:27:56.091655758 +0000 UTC m=+35.200749224" lastFinishedPulling="2026-04-24 21:27:57.484216917 +0000 UTC m=+36.593310380" observedRunningTime="2026-04-24 21:28:00.798785384 +0000 UTC m=+39.907878870" watchObservedRunningTime="2026-04-24 21:28:00.799759209 +0000 UTC m=+39.908852694"
Apr 24 21:28:00.815537 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:00.815490 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-2pb8s" podStartSLOduration=2.398610277 podStartE2EDuration="5.815475748s" podCreationTimestamp="2026-04-24 21:27:55 +0000 UTC" firstStartedPulling="2026-04-24 21:27:56.248375553 +0000 UTC m=+35.357469029" lastFinishedPulling="2026-04-24 21:27:59.665241033 +0000 UTC m=+38.774334500" observedRunningTime="2026-04-24 21:28:00.814535825 +0000 UTC m=+39.923629310" watchObservedRunningTime="2026-04-24 21:28:00.815475748 +0000 UTC m=+39.924569232"
Apr 24 21:28:02.025338 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:02.025296 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57bed6cb-14ec-45db-98ca-49d0e2a82730-metrics-tls\") pod \"dns-default-sdjvb\" (UID: \"57bed6cb-14ec-45db-98ca-49d0e2a82730\") " pod="openshift-dns/dns-default-sdjvb"
Apr 24 21:28:02.025792 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:02.025346 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-tls\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf"
Apr 24 21:28:02.025792 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:28:02.025464 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:28:02.025792 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:28:02.025479 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-579b557bd8-ns9mf: secret "image-registry-tls" not found
Apr 24 21:28:02.025792 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:28:02.025508 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:28:02.025792
ip-10-0-130-31 kubenswrapper[2581]: E0424 21:28:02.025544 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-tls podName:2e08127c-5e30-4001-8d57-fe051bc907b1 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:10.025524961 +0000 UTC m=+49.134618424 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-tls") pod "image-registry-579b557bd8-ns9mf" (UID: "2e08127c-5e30-4001-8d57-fe051bc907b1") : secret "image-registry-tls" not found Apr 24 21:28:02.025792 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:28:02.025568 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57bed6cb-14ec-45db-98ca-49d0e2a82730-metrics-tls podName:57bed6cb-14ec-45db-98ca-49d0e2a82730 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:10.025550869 +0000 UTC m=+49.134644335 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/57bed6cb-14ec-45db-98ca-49d0e2a82730-metrics-tls") pod "dns-default-sdjvb" (UID: "57bed6cb-14ec-45db-98ca-49d0e2a82730") : secret "dns-default-metrics-tls" not found Apr 24 21:28:02.025792 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:02.025596 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78d3eb37-5559-42a5-b81b-c2219787cc5b-cert\") pod \"ingress-canary-zdbnj\" (UID: \"78d3eb37-5559-42a5-b81b-c2219787cc5b\") " pod="openshift-ingress-canary/ingress-canary-zdbnj" Apr 24 21:28:02.025792 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:28:02.025700 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:28:02.025792 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:28:02.025748 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78d3eb37-5559-42a5-b81b-c2219787cc5b-cert podName:78d3eb37-5559-42a5-b81b-c2219787cc5b nodeName:}" failed. No retries permitted until 2026-04-24 21:28:10.025738039 +0000 UTC m=+49.134831506 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/78d3eb37-5559-42a5-b81b-c2219787cc5b-cert") pod "ingress-canary-zdbnj" (UID: "78d3eb37-5559-42a5-b81b-c2219787cc5b") : secret "canary-serving-cert" not found Apr 24 21:28:06.861883 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:06.861839 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/151371d6-8756-495b-8181-7fdcb156d1f4-original-pull-secret\") pod \"global-pull-secret-syncer-rklj9\" (UID: \"151371d6-8756-495b-8181-7fdcb156d1f4\") " pod="kube-system/global-pull-secret-syncer-rklj9" Apr 24 21:28:06.865249 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:06.865224 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/151371d6-8756-495b-8181-7fdcb156d1f4-original-pull-secret\") pod \"global-pull-secret-syncer-rklj9\" (UID: \"151371d6-8756-495b-8181-7fdcb156d1f4\") " pod="kube-system/global-pull-secret-syncer-rklj9" Apr 24 21:28:07.130848 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:07.130771 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rklj9" Apr 24 21:28:07.264348 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:07.264208 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rklj9"] Apr 24 21:28:07.760306 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:07.760252 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rklj9" event={"ID":"151371d6-8756-495b-8181-7fdcb156d1f4","Type":"ContainerStarted","Data":"047d7617a98d218dc4337e31b94f93fd6c7fe4be3e6d791845925cca53dd31f6"} Apr 24 21:28:10.088841 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:10.088784 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57bed6cb-14ec-45db-98ca-49d0e2a82730-metrics-tls\") pod \"dns-default-sdjvb\" (UID: \"57bed6cb-14ec-45db-98ca-49d0e2a82730\") " pod="openshift-dns/dns-default-sdjvb" Apr 24 21:28:10.088841 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:10.088837 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-tls\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" Apr 24 21:28:10.089458 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:10.088867 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/78d3eb37-5559-42a5-b81b-c2219787cc5b-cert\") pod \"ingress-canary-zdbnj\" (UID: \"78d3eb37-5559-42a5-b81b-c2219787cc5b\") " pod="openshift-ingress-canary/ingress-canary-zdbnj" Apr 24 21:28:10.091903 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:10.091845 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/78d3eb37-5559-42a5-b81b-c2219787cc5b-cert\") pod \"ingress-canary-zdbnj\" (UID: \"78d3eb37-5559-42a5-b81b-c2219787cc5b\") " pod="openshift-ingress-canary/ingress-canary-zdbnj" Apr 24 21:28:10.091903 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:10.091883 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57bed6cb-14ec-45db-98ca-49d0e2a82730-metrics-tls\") pod \"dns-default-sdjvb\" (UID: \"57bed6cb-14ec-45db-98ca-49d0e2a82730\") " pod="openshift-dns/dns-default-sdjvb" Apr 24 21:28:10.098564 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:10.098538 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-tls\") pod \"image-registry-579b557bd8-ns9mf\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") " pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" Apr 24 21:28:10.169663 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:10.169628 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" Apr 24 21:28:10.187438 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:10.187400 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zdbnj" Apr 24 21:28:10.195315 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:10.195288 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-sdjvb" Apr 24 21:28:11.137470 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:11.137281 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-sdjvb"] Apr 24 21:28:11.155270 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:28:11.155231 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57bed6cb_14ec_45db_98ca_49d0e2a82730.slice/crio-8442add0318c6ab40da2b9837740f1fbea9b95005229217447d8a4d05e19cf07 WatchSource:0}: Error finding container 8442add0318c6ab40da2b9837740f1fbea9b95005229217447d8a4d05e19cf07: Status 404 returned error can't find the container with id 8442add0318c6ab40da2b9837740f1fbea9b95005229217447d8a4d05e19cf07 Apr 24 21:28:11.361363 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:11.361335 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zdbnj"] Apr 24 21:28:11.362823 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:28:11.362793 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78d3eb37_5559_42a5_b81b_c2219787cc5b.slice/crio-4cc28291bf1e2303300d12150d455a2329ed45b60336521bfd080232b6309e29 WatchSource:0}: Error finding container 4cc28291bf1e2303300d12150d455a2329ed45b60336521bfd080232b6309e29: Status 404 returned error can't find the container with id 4cc28291bf1e2303300d12150d455a2329ed45b60336521bfd080232b6309e29 Apr 24 21:28:11.364774 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:11.364655 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-579b557bd8-ns9mf"] Apr 24 21:28:11.367388 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:28:11.367364 2581 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e08127c_5e30_4001_8d57_fe051bc907b1.slice/crio-737f5da6b9dac89450244187938c3a00ee09a678f45eed57cd94a1c9f544205d WatchSource:0}: Error finding container 737f5da6b9dac89450244187938c3a00ee09a678f45eed57cd94a1c9f544205d: Status 404 returned error can't find the container with id 737f5da6b9dac89450244187938c3a00ee09a678f45eed57cd94a1c9f544205d Apr 24 21:28:11.772684 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:11.772592 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" event={"ID":"2e08127c-5e30-4001-8d57-fe051bc907b1","Type":"ContainerStarted","Data":"b81f0ebf9b77c93343fb00c53ec708b36381078dbe4291a9293559e044dc976d"} Apr 24 21:28:11.772684 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:11.772640 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" event={"ID":"2e08127c-5e30-4001-8d57-fe051bc907b1","Type":"ContainerStarted","Data":"737f5da6b9dac89450244187938c3a00ee09a678f45eed57cd94a1c9f544205d"} Apr 24 21:28:11.772909 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:11.772705 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" Apr 24 21:28:11.775644 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:11.775616 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sdjvb" event={"ID":"57bed6cb-14ec-45db-98ca-49d0e2a82730","Type":"ContainerStarted","Data":"8442add0318c6ab40da2b9837740f1fbea9b95005229217447d8a4d05e19cf07"} Apr 24 21:28:11.777505 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:11.777478 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zdbnj" 
event={"ID":"78d3eb37-5559-42a5-b81b-c2219787cc5b","Type":"ContainerStarted","Data":"4cc28291bf1e2303300d12150d455a2329ed45b60336521bfd080232b6309e29"} Apr 24 21:28:11.779348 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:11.779323 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rklj9" event={"ID":"151371d6-8756-495b-8181-7fdcb156d1f4","Type":"ContainerStarted","Data":"a60a96553135fe499aa15d813265b0e110e24c51e9585b928470d9f9bae8203b"} Apr 24 21:28:11.795381 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:11.795318 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" podStartSLOduration=48.79530324 podStartE2EDuration="48.79530324s" podCreationTimestamp="2026-04-24 21:27:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:11.79505203 +0000 UTC m=+50.904145513" watchObservedRunningTime="2026-04-24 21:28:11.79530324 +0000 UTC m=+50.904396715" Apr 24 21:28:11.814281 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:11.814227 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-rklj9" podStartSLOduration=33.705553572 podStartE2EDuration="37.814211692s" podCreationTimestamp="2026-04-24 21:27:34 +0000 UTC" firstStartedPulling="2026-04-24 21:28:07.268547414 +0000 UTC m=+46.377640878" lastFinishedPulling="2026-04-24 21:28:11.37720553 +0000 UTC m=+50.486298998" observedRunningTime="2026-04-24 21:28:11.812812603 +0000 UTC m=+50.921906089" watchObservedRunningTime="2026-04-24 21:28:11.814211692 +0000 UTC m=+50.923305189" Apr 24 21:28:13.789793 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:13.789751 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sdjvb" 
event={"ID":"57bed6cb-14ec-45db-98ca-49d0e2a82730","Type":"ContainerStarted","Data":"a303335ecd03866b2215f2e7488850d51de58a1c0b7559be7ee8c0ff8b0e0173"} Apr 24 21:28:13.791531 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:13.791501 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zdbnj" event={"ID":"78d3eb37-5559-42a5-b81b-c2219787cc5b","Type":"ContainerStarted","Data":"233a4a3a22057ef997502d7da9b2b56f58778e7fde080f1059c5cec50e890c5d"} Apr 24 21:28:13.822000 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:13.821942 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zdbnj" podStartSLOduration=17.630185793 podStartE2EDuration="19.821922863s" podCreationTimestamp="2026-04-24 21:27:54 +0000 UTC" firstStartedPulling="2026-04-24 21:28:11.365315587 +0000 UTC m=+50.474409050" lastFinishedPulling="2026-04-24 21:28:13.557052643 +0000 UTC m=+52.666146120" observedRunningTime="2026-04-24 21:28:13.821795774 +0000 UTC m=+52.930889260" watchObservedRunningTime="2026-04-24 21:28:13.821922863 +0000 UTC m=+52.931016349" Apr 24 21:28:14.795766 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:14.795727 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-sdjvb" event={"ID":"57bed6cb-14ec-45db-98ca-49d0e2a82730","Type":"ContainerStarted","Data":"44436655cc0cd4cd067b29b9d04c02d55d7bcdb4997b1007e5803e01789b4a03"} Apr 24 21:28:14.796171 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:14.795830 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-sdjvb" Apr 24 21:28:14.815285 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:14.815243 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-sdjvb" podStartSLOduration=18.420772141 podStartE2EDuration="20.815229141s" podCreationTimestamp="2026-04-24 21:27:54 +0000 UTC" 
firstStartedPulling="2026-04-24 21:28:11.15795912 +0000 UTC m=+50.267052601" lastFinishedPulling="2026-04-24 21:28:13.552416138 +0000 UTC m=+52.661509601" observedRunningTime="2026-04-24 21:28:14.814977501 +0000 UTC m=+53.924070987" watchObservedRunningTime="2026-04-24 21:28:14.815229141 +0000 UTC m=+53.924322626" Apr 24 21:28:15.159345 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.159250 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-579b557bd8-ns9mf"] Apr 24 21:28:15.170454 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.170423 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qxfjs"] Apr 24 21:28:15.196989 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.196952 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qxfjs"] Apr 24 21:28:15.197164 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.197088 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qxfjs" Apr 24 21:28:15.200267 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.200239 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 24 21:28:15.200413 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.200239 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-26dxb\"" Apr 24 21:28:15.291420 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.291382 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-dfkzt"] Apr 24 21:28:15.312608 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.312572 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5954c7d68c-jd96l"] Apr 24 21:28:15.312777 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.312739 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dfkzt" Apr 24 21:28:15.315203 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.315160 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:28:15.315203 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.315193 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 21:28:15.315548 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.315530 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-6qqhw\"" Apr 24 21:28:15.315715 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.315682 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:28:15.315957 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.315934 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 21:28:15.330010 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.329986 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a4a2f809-339a-4381-826d-07e74cd2ec89-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qxfjs\" (UID: \"a4a2f809-339a-4381-826d-07e74cd2ec89\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qxfjs" Apr 24 21:28:15.330860 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.330844 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dfkzt"] Apr 24 21:28:15.330902 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.330866 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-5954c7d68c-jd96l"] Apr 24 21:28:15.330956 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.330947 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.430597 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.430515 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a4a2f809-339a-4381-826d-07e74cd2ec89-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qxfjs\" (UID: \"a4a2f809-339a-4381-826d-07e74cd2ec89\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qxfjs" Apr 24 21:28:15.430597 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.430570 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/628cebd3-36c6-447e-be5a-217fb026b917-installation-pull-secrets\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.430808 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.430625 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/961f627d-2e0c-4ad8-95af-5e589bce04df-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dfkzt\" (UID: \"961f627d-2e0c-4ad8-95af-5e589bce04df\") " pod="openshift-insights/insights-runtime-extractor-dfkzt" Apr 24 21:28:15.430808 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.430668 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m8sv\" (UniqueName: \"kubernetes.io/projected/628cebd3-36c6-447e-be5a-217fb026b917-kube-api-access-2m8sv\") pod 
\"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.430808 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.430685 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/961f627d-2e0c-4ad8-95af-5e589bce04df-data-volume\") pod \"insights-runtime-extractor-dfkzt\" (UID: \"961f627d-2e0c-4ad8-95af-5e589bce04df\") " pod="openshift-insights/insights-runtime-extractor-dfkzt" Apr 24 21:28:15.430808 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.430745 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/628cebd3-36c6-447e-be5a-217fb026b917-registry-certificates\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.430808 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.430768 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/961f627d-2e0c-4ad8-95af-5e589bce04df-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dfkzt\" (UID: \"961f627d-2e0c-4ad8-95af-5e589bce04df\") " pod="openshift-insights/insights-runtime-extractor-dfkzt" Apr 24 21:28:15.431001 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.430825 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/628cebd3-36c6-447e-be5a-217fb026b917-trusted-ca\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.431001 ip-10-0-130-31 
kubenswrapper[2581]: I0424 21:28:15.430874 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/628cebd3-36c6-447e-be5a-217fb026b917-image-registry-private-configuration\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.431001 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.430891 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/628cebd3-36c6-447e-be5a-217fb026b917-registry-tls\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.431001 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.430918 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/628cebd3-36c6-447e-be5a-217fb026b917-bound-sa-token\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.431001 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.430968 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7mpw\" (UniqueName: \"kubernetes.io/projected/961f627d-2e0c-4ad8-95af-5e589bce04df-kube-api-access-g7mpw\") pod \"insights-runtime-extractor-dfkzt\" (UID: \"961f627d-2e0c-4ad8-95af-5e589bce04df\") " pod="openshift-insights/insights-runtime-extractor-dfkzt" Apr 24 21:28:15.431221 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.431040 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/628cebd3-36c6-447e-be5a-217fb026b917-ca-trust-extracted\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.431221 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.431067 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/961f627d-2e0c-4ad8-95af-5e589bce04df-crio-socket\") pod \"insights-runtime-extractor-dfkzt\" (UID: \"961f627d-2e0c-4ad8-95af-5e589bce04df\") " pod="openshift-insights/insights-runtime-extractor-dfkzt" Apr 24 21:28:15.433289 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.433267 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/a4a2f809-339a-4381-826d-07e74cd2ec89-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qxfjs\" (UID: \"a4a2f809-339a-4381-826d-07e74cd2ec89\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qxfjs" Apr 24 21:28:15.506585 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.506541 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qxfjs" Apr 24 21:28:15.531781 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.531752 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/628cebd3-36c6-447e-be5a-217fb026b917-ca-trust-extracted\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.531927 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.531787 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/961f627d-2e0c-4ad8-95af-5e589bce04df-crio-socket\") pod \"insights-runtime-extractor-dfkzt\" (UID: \"961f627d-2e0c-4ad8-95af-5e589bce04df\") " pod="openshift-insights/insights-runtime-extractor-dfkzt" Apr 24 21:28:15.531927 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.531832 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/628cebd3-36c6-447e-be5a-217fb026b917-installation-pull-secrets\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.531927 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.531856 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/961f627d-2e0c-4ad8-95af-5e589bce04df-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dfkzt\" (UID: \"961f627d-2e0c-4ad8-95af-5e589bce04df\") " pod="openshift-insights/insights-runtime-extractor-dfkzt" Apr 24 21:28:15.531927 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.531887 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-2m8sv\" (UniqueName: \"kubernetes.io/projected/628cebd3-36c6-447e-be5a-217fb026b917-kube-api-access-2m8sv\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.532110 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.531994 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/961f627d-2e0c-4ad8-95af-5e589bce04df-crio-socket\") pod \"insights-runtime-extractor-dfkzt\" (UID: \"961f627d-2e0c-4ad8-95af-5e589bce04df\") " pod="openshift-insights/insights-runtime-extractor-dfkzt" Apr 24 21:28:15.532110 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.532065 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/961f627d-2e0c-4ad8-95af-5e589bce04df-data-volume\") pod \"insights-runtime-extractor-dfkzt\" (UID: \"961f627d-2e0c-4ad8-95af-5e589bce04df\") " pod="openshift-insights/insights-runtime-extractor-dfkzt" Apr 24 21:28:15.532176 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.532117 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/628cebd3-36c6-447e-be5a-217fb026b917-registry-certificates\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.532176 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.532144 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/961f627d-2e0c-4ad8-95af-5e589bce04df-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dfkzt\" (UID: \"961f627d-2e0c-4ad8-95af-5e589bce04df\") " 
pod="openshift-insights/insights-runtime-extractor-dfkzt" Apr 24 21:28:15.532242 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.532188 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/628cebd3-36c6-447e-be5a-217fb026b917-trusted-ca\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.532242 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.532220 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/628cebd3-36c6-447e-be5a-217fb026b917-ca-trust-extracted\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.532325 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.532225 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/628cebd3-36c6-447e-be5a-217fb026b917-image-registry-private-configuration\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.532325 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.532281 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/628cebd3-36c6-447e-be5a-217fb026b917-registry-tls\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.532325 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.532313 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/628cebd3-36c6-447e-be5a-217fb026b917-bound-sa-token\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.532470 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.532338 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g7mpw\" (UniqueName: \"kubernetes.io/projected/961f627d-2e0c-4ad8-95af-5e589bce04df-kube-api-access-g7mpw\") pod \"insights-runtime-extractor-dfkzt\" (UID: \"961f627d-2e0c-4ad8-95af-5e589bce04df\") " pod="openshift-insights/insights-runtime-extractor-dfkzt" Apr 24 21:28:15.532528 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.532465 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/961f627d-2e0c-4ad8-95af-5e589bce04df-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dfkzt\" (UID: \"961f627d-2e0c-4ad8-95af-5e589bce04df\") " pod="openshift-insights/insights-runtime-extractor-dfkzt" Apr 24 21:28:15.532585 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.532565 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/961f627d-2e0c-4ad8-95af-5e589bce04df-data-volume\") pod \"insights-runtime-extractor-dfkzt\" (UID: \"961f627d-2e0c-4ad8-95af-5e589bce04df\") " pod="openshift-insights/insights-runtime-extractor-dfkzt" Apr 24 21:28:15.533665 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.533635 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/628cebd3-36c6-447e-be5a-217fb026b917-trusted-ca\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.533814 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.533793 
2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/628cebd3-36c6-447e-be5a-217fb026b917-registry-certificates\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.534830 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.534808 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/961f627d-2e0c-4ad8-95af-5e589bce04df-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dfkzt\" (UID: \"961f627d-2e0c-4ad8-95af-5e589bce04df\") " pod="openshift-insights/insights-runtime-extractor-dfkzt" Apr 24 21:28:15.534961 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.534933 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/628cebd3-36c6-447e-be5a-217fb026b917-installation-pull-secrets\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.535144 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.535122 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/628cebd3-36c6-447e-be5a-217fb026b917-image-registry-private-configuration\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.535338 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.535320 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/628cebd3-36c6-447e-be5a-217fb026b917-registry-tls\") pod 
\"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.558634 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.558549 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m8sv\" (UniqueName: \"kubernetes.io/projected/628cebd3-36c6-447e-be5a-217fb026b917-kube-api-access-2m8sv\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.558781 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.558756 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7mpw\" (UniqueName: \"kubernetes.io/projected/961f627d-2e0c-4ad8-95af-5e589bce04df-kube-api-access-g7mpw\") pod \"insights-runtime-extractor-dfkzt\" (UID: \"961f627d-2e0c-4ad8-95af-5e589bce04df\") " pod="openshift-insights/insights-runtime-extractor-dfkzt" Apr 24 21:28:15.564156 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.564123 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/628cebd3-36c6-447e-be5a-217fb026b917-bound-sa-token\") pod \"image-registry-5954c7d68c-jd96l\" (UID: \"628cebd3-36c6-447e-be5a-217fb026b917\") " pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.611803 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.611774 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-tvnrx"] Apr 24 21:28:15.626277 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.626253 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dfkzt" Apr 24 21:28:15.638759 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.638736 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:15.638941 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:28:15.638915 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4a2f809_339a_4381_826d_07e74cd2ec89.slice/crio-51e1c5d8266734bdb3617538f3ead5249e3f10799219f6423804fa18eaf3bbdd WatchSource:0}: Error finding container 51e1c5d8266734bdb3617538f3ead5249e3f10799219f6423804fa18eaf3bbdd: Status 404 returned error can't find the container with id 51e1c5d8266734bdb3617538f3ead5249e3f10799219f6423804fa18eaf3bbdd Apr 24 21:28:15.648655 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.648633 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-tvnrx"] Apr 24 21:28:15.648655 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.648659 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qxfjs"] Apr 24 21:28:15.648772 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.648750 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-tvnrx" Apr 24 21:28:15.651843 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.651823 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-876td\"" Apr 24 21:28:15.651942 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.651891 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 21:28:15.652529 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.652506 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 21:28:15.767041 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.766996 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dfkzt"] Apr 24 21:28:15.770243 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:28:15.770215 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod961f627d_2e0c_4ad8_95af_5e589bce04df.slice/crio-cd76d18dd58867c077f7e29722451b2fc8ed65c6e3c473f151470892738ee866 WatchSource:0}: Error finding container cd76d18dd58867c077f7e29722451b2fc8ed65c6e3c473f151470892738ee866: Status 404 returned error can't find the container with id cd76d18dd58867c077f7e29722451b2fc8ed65c6e3c473f151470892738ee866 Apr 24 21:28:15.800364 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.800338 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qxfjs" event={"ID":"a4a2f809-339a-4381-826d-07e74cd2ec89","Type":"ContainerStarted","Data":"51e1c5d8266734bdb3617538f3ead5249e3f10799219f6423804fa18eaf3bbdd"} Apr 24 21:28:15.800673 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.800451 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-5954c7d68c-jd96l"] Apr 24 21:28:15.801365 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.801332 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dfkzt" event={"ID":"961f627d-2e0c-4ad8-95af-5e589bce04df","Type":"ContainerStarted","Data":"cd76d18dd58867c077f7e29722451b2fc8ed65c6e3c473f151470892738ee866"} Apr 24 21:28:15.804989 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:28:15.804963 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod628cebd3_36c6_447e_be5a_217fb026b917.slice/crio-982913e8060b362c945accf9d5b3555de779262c1cf2ad41ae7be500680848ae WatchSource:0}: Error finding container 982913e8060b362c945accf9d5b3555de779262c1cf2ad41ae7be500680848ae: Status 404 returned error can't find the container with id 982913e8060b362c945accf9d5b3555de779262c1cf2ad41ae7be500680848ae Apr 24 21:28:15.835469 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.835444 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhzwd\" (UniqueName: \"kubernetes.io/projected/6a569780-c1f3-462e-87dd-d4b03fe11d70-kube-api-access-jhzwd\") pod \"downloads-6bcc868b7-tvnrx\" (UID: \"6a569780-c1f3-462e-87dd-d4b03fe11d70\") " pod="openshift-console/downloads-6bcc868b7-tvnrx" Apr 24 21:28:15.936413 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.936376 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhzwd\" (UniqueName: \"kubernetes.io/projected/6a569780-c1f3-462e-87dd-d4b03fe11d70-kube-api-access-jhzwd\") pod \"downloads-6bcc868b7-tvnrx\" (UID: \"6a569780-c1f3-462e-87dd-d4b03fe11d70\") " pod="openshift-console/downloads-6bcc868b7-tvnrx" Apr 24 21:28:15.947540 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.947518 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jhzwd\" (UniqueName: \"kubernetes.io/projected/6a569780-c1f3-462e-87dd-d4b03fe11d70-kube-api-access-jhzwd\") pod \"downloads-6bcc868b7-tvnrx\" (UID: \"6a569780-c1f3-462e-87dd-d4b03fe11d70\") " pod="openshift-console/downloads-6bcc868b7-tvnrx" Apr 24 21:28:15.959625 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:15.959559 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-tvnrx" Apr 24 21:28:16.096600 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:16.096572 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-tvnrx"] Apr 24 21:28:16.099749 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:28:16.099718 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a569780_c1f3_462e_87dd_d4b03fe11d70.slice/crio-4dc9f5fad0f39a87df1279814e69a74535d385af089d1e0627a805a5a63680a7 WatchSource:0}: Error finding container 4dc9f5fad0f39a87df1279814e69a74535d385af089d1e0627a805a5a63680a7: Status 404 returned error can't find the container with id 4dc9f5fad0f39a87df1279814e69a74535d385af089d1e0627a805a5a63680a7 Apr 24 21:28:16.807292 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:16.807258 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dfkzt" event={"ID":"961f627d-2e0c-4ad8-95af-5e589bce04df","Type":"ContainerStarted","Data":"39b79c8b3f0e6ab3db3b6ed3f6941849900349c5300294f75042ec7adb65caff"} Apr 24 21:28:16.808530 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:16.808497 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-tvnrx" event={"ID":"6a569780-c1f3-462e-87dd-d4b03fe11d70","Type":"ContainerStarted","Data":"4dc9f5fad0f39a87df1279814e69a74535d385af089d1e0627a805a5a63680a7"} Apr 24 21:28:16.810258 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:16.810234 2581 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" event={"ID":"628cebd3-36c6-447e-be5a-217fb026b917","Type":"ContainerStarted","Data":"331c503af08a68c947877c7f6abeb5e3fac71576d9400c806f12ba9cd5ef2360"} Apr 24 21:28:16.810357 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:16.810261 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" event={"ID":"628cebd3-36c6-447e-be5a-217fb026b917","Type":"ContainerStarted","Data":"982913e8060b362c945accf9d5b3555de779262c1cf2ad41ae7be500680848ae"} Apr 24 21:28:16.810998 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:16.810976 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" Apr 24 21:28:16.838581 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:16.838522 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5954c7d68c-jd96l" podStartSLOduration=1.838502463 podStartE2EDuration="1.838502463s" podCreationTimestamp="2026-04-24 21:28:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:16.836607386 +0000 UTC m=+55.945700885" watchObservedRunningTime="2026-04-24 21:28:16.838502463 +0000 UTC m=+55.947595947" Apr 24 21:28:17.815914 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:17.815865 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qxfjs" event={"ID":"a4a2f809-339a-4381-826d-07e74cd2ec89","Type":"ContainerStarted","Data":"b5a7c02472e08901d8e4ba6593b4882580f01dfd1979a0a6eeac200e7c7d1a90"} Apr 24 21:28:17.816857 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:17.816839 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qxfjs" Apr 24 21:28:17.819313 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:17.819286 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dfkzt" event={"ID":"961f627d-2e0c-4ad8-95af-5e589bce04df","Type":"ContainerStarted","Data":"b4520b330cec5a7260ca5ccabdbca1baadabae181cc9b0bc24a91e20378b5fd3"} Apr 24 21:28:17.822827 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:17.822804 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qxfjs" Apr 24 21:28:17.832725 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:17.832675 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qxfjs" podStartSLOduration=1.195862394 podStartE2EDuration="2.832659398s" podCreationTimestamp="2026-04-24 21:28:15 +0000 UTC" firstStartedPulling="2026-04-24 21:28:15.652958905 +0000 UTC m=+54.762052371" lastFinishedPulling="2026-04-24 21:28:17.289755903 +0000 UTC m=+56.398849375" observedRunningTime="2026-04-24 21:28:17.831611063 +0000 UTC m=+56.940704550" watchObservedRunningTime="2026-04-24 21:28:17.832659398 +0000 UTC m=+56.941752885" Apr 24 21:28:18.823519 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:18.823491 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-bcppr"] Apr 24 21:28:18.828124 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:18.828100 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-bcppr" Apr 24 21:28:18.832255 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:18.832129 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 24 21:28:18.832433 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:18.832415 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 21:28:18.833273 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:18.833254 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:28:18.833381 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:18.833316 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 21:28:18.833451 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:18.833260 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-h4hwd\"" Apr 24 21:28:18.833539 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:18.833522 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 24 21:28:18.840558 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:18.840535 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-bcppr"] Apr 24 21:28:18.859951 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:18.859915 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd73b28b-f395-46a8-b132-56d62ce821db-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-bcppr\" (UID: \"fd73b28b-f395-46a8-b132-56d62ce821db\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-bcppr" Apr 24 21:28:18.860148 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:18.860099 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd73b28b-f395-46a8-b132-56d62ce821db-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-bcppr\" (UID: \"fd73b28b-f395-46a8-b132-56d62ce821db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bcppr" Apr 24 21:28:18.860148 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:18.860139 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7fv6\" (UniqueName: \"kubernetes.io/projected/fd73b28b-f395-46a8-b132-56d62ce821db-kube-api-access-g7fv6\") pod \"prometheus-operator-5676c8c784-bcppr\" (UID: \"fd73b28b-f395-46a8-b132-56d62ce821db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bcppr" Apr 24 21:28:18.860268 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:18.860214 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fd73b28b-f395-46a8-b132-56d62ce821db-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-bcppr\" (UID: \"fd73b28b-f395-46a8-b132-56d62ce821db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bcppr" Apr 24 21:28:18.961115 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:18.961080 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd73b28b-f395-46a8-b132-56d62ce821db-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-bcppr\" (UID: \"fd73b28b-f395-46a8-b132-56d62ce821db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bcppr" Apr 24 21:28:18.961303 ip-10-0-130-31 
kubenswrapper[2581]: I0424 21:28:18.961139 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g7fv6\" (UniqueName: \"kubernetes.io/projected/fd73b28b-f395-46a8-b132-56d62ce821db-kube-api-access-g7fv6\") pod \"prometheus-operator-5676c8c784-bcppr\" (UID: \"fd73b28b-f395-46a8-b132-56d62ce821db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bcppr" Apr 24 21:28:18.961303 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:18.961183 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fd73b28b-f395-46a8-b132-56d62ce821db-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-bcppr\" (UID: \"fd73b28b-f395-46a8-b132-56d62ce821db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bcppr" Apr 24 21:28:18.961303 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:18.961220 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd73b28b-f395-46a8-b132-56d62ce821db-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-bcppr\" (UID: \"fd73b28b-f395-46a8-b132-56d62ce821db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bcppr" Apr 24 21:28:18.961303 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:28:18.961248 2581 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 24 21:28:18.961497 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:28:18.961327 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd73b28b-f395-46a8-b132-56d62ce821db-prometheus-operator-tls podName:fd73b28b-f395-46a8-b132-56d62ce821db nodeName:}" failed. No retries permitted until 2026-04-24 21:28:19.461307551 +0000 UTC m=+58.570401032 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/fd73b28b-f395-46a8-b132-56d62ce821db-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-bcppr" (UID: "fd73b28b-f395-46a8-b132-56d62ce821db") : secret "prometheus-operator-tls" not found Apr 24 21:28:18.962074 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:18.962046 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd73b28b-f395-46a8-b132-56d62ce821db-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-bcppr\" (UID: \"fd73b28b-f395-46a8-b132-56d62ce821db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bcppr" Apr 24 21:28:18.964048 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:18.964005 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fd73b28b-f395-46a8-b132-56d62ce821db-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-bcppr\" (UID: \"fd73b28b-f395-46a8-b132-56d62ce821db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bcppr" Apr 24 21:28:18.972788 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:18.972751 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7fv6\" (UniqueName: \"kubernetes.io/projected/fd73b28b-f395-46a8-b132-56d62ce821db-kube-api-access-g7fv6\") pod \"prometheus-operator-5676c8c784-bcppr\" (UID: \"fd73b28b-f395-46a8-b132-56d62ce821db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bcppr" Apr 24 21:28:19.465721 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:19.465684 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd73b28b-f395-46a8-b132-56d62ce821db-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-bcppr\" (UID: 
\"fd73b28b-f395-46a8-b132-56d62ce821db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bcppr"
Apr 24 21:28:19.468710 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:19.468681 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd73b28b-f395-46a8-b132-56d62ce821db-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-bcppr\" (UID: \"fd73b28b-f395-46a8-b132-56d62ce821db\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-bcppr"
Apr 24 21:28:19.710903 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:19.710877 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8j4mf"
Apr 24 21:28:19.740343 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:19.740274 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-bcppr"
Apr 24 21:28:19.835388 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:19.835335 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dfkzt" event={"ID":"961f627d-2e0c-4ad8-95af-5e589bce04df","Type":"ContainerStarted","Data":"f178a067d857ad5ac4c74ea6d3529773c2dc6cb5b2432f352bf419e0e6942c59"}
Apr 24 21:28:19.859347 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:19.859304 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-dfkzt" podStartSLOduration=1.958977524 podStartE2EDuration="4.859288082s" podCreationTimestamp="2026-04-24 21:28:15 +0000 UTC" firstStartedPulling="2026-04-24 21:28:15.860243201 +0000 UTC m=+54.969336664" lastFinishedPulling="2026-04-24 21:28:18.760553742 +0000 UTC m=+57.869647222" observedRunningTime="2026-04-24 21:28:19.858733268 +0000 UTC m=+58.967826758" watchObservedRunningTime="2026-04-24 21:28:19.859288082 +0000 UTC m=+58.968381567"
Apr 24 21:28:19.905872 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:19.905837 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-bcppr"]
Apr 24 21:28:19.912344 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:28:19.912309 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd73b28b_f395_46a8_b132_56d62ce821db.slice/crio-4b7e7c8533c8f5692c73529f91f50e1815a7578e622c7687df122b56c0bd1185 WatchSource:0}: Error finding container 4b7e7c8533c8f5692c73529f91f50e1815a7578e622c7687df122b56c0bd1185: Status 404 returned error can't find the container with id 4b7e7c8533c8f5692c73529f91f50e1815a7578e622c7687df122b56c0bd1185
Apr 24 21:28:20.839643 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:20.839583 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-bcppr" event={"ID":"fd73b28b-f395-46a8-b132-56d62ce821db","Type":"ContainerStarted","Data":"4b7e7c8533c8f5692c73529f91f50e1815a7578e622c7687df122b56c0bd1185"}
Apr 24 21:28:21.844149 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:21.844105 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-bcppr" event={"ID":"fd73b28b-f395-46a8-b132-56d62ce821db","Type":"ContainerStarted","Data":"f552389d4378b67959e6ef2fa278c5af1ea969c8741f7445827482655fa85389"}
Apr 24 21:28:21.844149 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:21.844153 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-bcppr" event={"ID":"fd73b28b-f395-46a8-b132-56d62ce821db","Type":"ContainerStarted","Data":"596fdadefbf7478b47e5d501cfea85405ed85ca3f4a68f8cf84c9ed89078d7b6"}
Apr 24 21:28:21.863306 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:21.863253 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-bcppr" podStartSLOduration=2.554497314 podStartE2EDuration="3.863233441s" podCreationTimestamp="2026-04-24 21:28:18 +0000 UTC" firstStartedPulling="2026-04-24 21:28:19.91463822 +0000 UTC m=+59.023731684" lastFinishedPulling="2026-04-24 21:28:21.223374335 +0000 UTC m=+60.332467811" observedRunningTime="2026-04-24 21:28:21.862660314 +0000 UTC m=+60.971753800" watchObservedRunningTime="2026-04-24 21:28:21.863233441 +0000 UTC m=+60.972326928"
Apr 24 21:28:24.803664 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:24.803624 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-sdjvb"
Apr 24 21:28:27.332949 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:27.332911 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs\") pod \"network-metrics-daemon-489tz\" (UID: \"5dfd7cf1-e10a-410e-b412-be269391a904\") " pod="openshift-multus/network-metrics-daemon-489tz"
Apr 24 21:28:27.335777 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:27.335751 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5dfd7cf1-e10a-410e-b412-be269391a904-metrics-certs\") pod \"network-metrics-daemon-489tz\" (UID: \"5dfd7cf1-e10a-410e-b412-be269391a904\") " pod="openshift-multus/network-metrics-daemon-489tz"
Apr 24 21:28:27.540426 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:27.540392 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-cbm9w\""
Apr 24 21:28:27.548271 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:27.548239 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-489tz"
Apr 24 21:28:28.435079 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.435047 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-pj7mx"]
Apr 24 21:28:28.470291 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.470264 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.472654 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.472617 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 21:28:28.472868 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.472853 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 21:28:28.473117 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.473101 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-kvvhf\""
Apr 24 21:28:28.473288 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.473274 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 21:28:28.544575 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.544283 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9159faba-fd40-4471-843c-488aef676c4e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.544575 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.544342 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9159faba-fd40-4471-843c-488aef676c4e-sys\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.544575 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.544384 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9159faba-fd40-4471-843c-488aef676c4e-root\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.544575 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.544416 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9159faba-fd40-4471-843c-488aef676c4e-node-exporter-wtmp\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.544575 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.544457 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9159faba-fd40-4471-843c-488aef676c4e-metrics-client-ca\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.545223 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.545012 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9159faba-fd40-4471-843c-488aef676c4e-node-exporter-tls\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.545223 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.545086 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cszvn\" (UniqueName: \"kubernetes.io/projected/9159faba-fd40-4471-843c-488aef676c4e-kube-api-access-cszvn\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.545223 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.545120 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9159faba-fd40-4471-843c-488aef676c4e-node-exporter-textfile\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.545223 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.545148 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9159faba-fd40-4471-843c-488aef676c4e-node-exporter-accelerators-collector-config\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.646066 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.646009 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9159faba-fd40-4471-843c-488aef676c4e-metrics-client-ca\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.646244 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.646073 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9159faba-fd40-4471-843c-488aef676c4e-node-exporter-tls\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.646244 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.646104 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cszvn\" (UniqueName: \"kubernetes.io/projected/9159faba-fd40-4471-843c-488aef676c4e-kube-api-access-cszvn\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.646244 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.646135 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9159faba-fd40-4471-843c-488aef676c4e-node-exporter-textfile\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.646244 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.646164 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9159faba-fd40-4471-843c-488aef676c4e-node-exporter-accelerators-collector-config\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.646244 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.646220 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9159faba-fd40-4471-843c-488aef676c4e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.646501 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.646257 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9159faba-fd40-4471-843c-488aef676c4e-sys\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.646501 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.646294 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9159faba-fd40-4471-843c-488aef676c4e-root\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.646501 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.646320 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9159faba-fd40-4471-843c-488aef676c4e-node-exporter-wtmp\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.646501 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.646480 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9159faba-fd40-4471-843c-488aef676c4e-node-exporter-wtmp\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.646698 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:28:28.646582 2581 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 24 21:28:28.646698 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:28:28.646638 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9159faba-fd40-4471-843c-488aef676c4e-node-exporter-tls podName:9159faba-fd40-4471-843c-488aef676c4e nodeName:}" failed. No retries permitted until 2026-04-24 21:28:29.146618463 +0000 UTC m=+68.255711931 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/9159faba-fd40-4471-843c-488aef676c4e-node-exporter-tls") pod "node-exporter-pj7mx" (UID: "9159faba-fd40-4471-843c-488aef676c4e") : secret "node-exporter-tls" not found
Apr 24 21:28:28.646698 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.646643 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9159faba-fd40-4471-843c-488aef676c4e-metrics-client-ca\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.647066 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.646930 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9159faba-fd40-4471-843c-488aef676c4e-sys\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.647066 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.646993 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9159faba-fd40-4471-843c-488aef676c4e-root\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.647246 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.647226 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9159faba-fd40-4471-843c-488aef676c4e-node-exporter-textfile\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.647469 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.647449 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9159faba-fd40-4471-843c-488aef676c4e-node-exporter-accelerators-collector-config\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.652044 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.649503 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9159faba-fd40-4471-843c-488aef676c4e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:28.660937 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:28.660915 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cszvn\" (UniqueName: \"kubernetes.io/projected/9159faba-fd40-4471-843c-488aef676c4e-kube-api-access-cszvn\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:29.151516 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:29.151475 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9159faba-fd40-4471-843c-488aef676c4e-node-exporter-tls\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:29.154329 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:29.154306 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9159faba-fd40-4471-843c-488aef676c4e-node-exporter-tls\") pod \"node-exporter-pj7mx\" (UID: \"9159faba-fd40-4471-843c-488aef676c4e\") " pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:29.383166 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:29.383131 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-pj7mx"
Apr 24 21:28:31.751773 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:31.751734 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-2f65f"
Apr 24 21:28:32.650455 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:28:32.650422 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9159faba_fd40_4471_843c_488aef676c4e.slice/crio-d3e2827fc776982069370259df3aa5c5c4ef27768d2470e31677e2fd3756d20e WatchSource:0}: Error finding container d3e2827fc776982069370259df3aa5c5c4ef27768d2470e31677e2fd3756d20e: Status 404 returned error can't find the container with id d3e2827fc776982069370259df3aa5c5c4ef27768d2470e31677e2fd3756d20e
Apr 24 21:28:32.768534 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:32.768494 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-489tz"]
Apr 24 21:28:32.855311 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:28:32.855271 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dfd7cf1_e10a_410e_b412_be269391a904.slice/crio-b757e2620d32a8078d7eb512180ba7c520243b6d8281cd9f317da07e2f42eb61 WatchSource:0}: Error finding container b757e2620d32a8078d7eb512180ba7c520243b6d8281cd9f317da07e2f42eb61: Status 404 returned error can't find the container with id b757e2620d32a8078d7eb512180ba7c520243b6d8281cd9f317da07e2f42eb61
Apr 24 21:28:32.877554 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:32.877522 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-489tz" event={"ID":"5dfd7cf1-e10a-410e-b412-be269391a904","Type":"ContainerStarted","Data":"b757e2620d32a8078d7eb512180ba7c520243b6d8281cd9f317da07e2f42eb61"}
Apr 24 21:28:32.878474 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:32.878450 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pj7mx" event={"ID":"9159faba-fd40-4471-843c-488aef676c4e","Type":"ContainerStarted","Data":"d3e2827fc776982069370259df3aa5c5c4ef27768d2470e31677e2fd3756d20e"}
Apr 24 21:28:33.885297 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:33.885186 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-tvnrx" event={"ID":"6a569780-c1f3-462e-87dd-d4b03fe11d70","Type":"ContainerStarted","Data":"e33b431ec2f1ad284db32cf55716fd20a104d378c600cf1a914e773bab251cdb"}
Apr 24 21:28:33.886011 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:33.885494 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-tvnrx"
Apr 24 21:28:33.906232 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:33.906181 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-tvnrx" podStartSLOduration=2.089545872 podStartE2EDuration="18.906163586s" podCreationTimestamp="2026-04-24 21:28:15 +0000 UTC" firstStartedPulling="2026-04-24 21:28:16.103667843 +0000 UTC m=+55.212761307" lastFinishedPulling="2026-04-24 21:28:32.920285554 +0000 UTC m=+72.029379021" observedRunningTime="2026-04-24 21:28:33.904197481 +0000 UTC m=+73.013290967" watchObservedRunningTime="2026-04-24 21:28:33.906163586 +0000 UTC m=+73.015257075"
Apr 24 21:28:33.913042 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:33.912765 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-tvnrx"
Apr 24 21:28:34.893491 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:34.893456 2581 generic.go:358] "Generic (PLEG): container finished" podID="9159faba-fd40-4471-843c-488aef676c4e" containerID="00e83c112487a6f6004c1e4273642b396d25a884dda7a1ff00ab48bb5e0c4e08" exitCode=0
Apr 24 21:28:34.894460 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:34.894437 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pj7mx" event={"ID":"9159faba-fd40-4471-843c-488aef676c4e","Type":"ContainerDied","Data":"00e83c112487a6f6004c1e4273642b396d25a884dda7a1ff00ab48bb5e0c4e08"}
Apr 24 21:28:35.167557 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:35.167476 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-579b557bd8-ns9mf"
Apr 24 21:28:35.899005 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:35.898959 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-489tz" event={"ID":"5dfd7cf1-e10a-410e-b412-be269391a904","Type":"ContainerStarted","Data":"fc777d6ef25af92fa97da2b26a4e711f0dca75018114edff04649b3b70d62ce0"}
Apr 24 21:28:35.899466 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:35.899010 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-489tz" event={"ID":"5dfd7cf1-e10a-410e-b412-be269391a904","Type":"ContainerStarted","Data":"f21be66cb433eafce060f83ff55145d62efedbd46def5f4f9378fff06669a889"}
Apr 24 21:28:35.901289 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:35.901261 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pj7mx" event={"ID":"9159faba-fd40-4471-843c-488aef676c4e","Type":"ContainerStarted","Data":"c4a1c61617e2b674ed5fd0f2c8fbe82186a7ebba406a4d55d2f5bb50eea79d1b"}
Apr 24 21:28:35.901413 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:35.901295 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pj7mx" event={"ID":"9159faba-fd40-4471-843c-488aef676c4e","Type":"ContainerStarted","Data":"c15af87a9774ffd89c9c8252fbeea992aec8c49a1063506a8640a47359056502"}
Apr 24 21:28:35.917797 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:35.917745 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-489tz" podStartSLOduration=72.850212224 podStartE2EDuration="1m14.917732375s" podCreationTimestamp="2026-04-24 21:27:21 +0000 UTC" firstStartedPulling="2026-04-24 21:28:32.874689749 +0000 UTC m=+71.983783231" lastFinishedPulling="2026-04-24 21:28:34.942209906 +0000 UTC m=+74.051303382" observedRunningTime="2026-04-24 21:28:35.915807897 +0000 UTC m=+75.024901406" watchObservedRunningTime="2026-04-24 21:28:35.917732375 +0000 UTC m=+75.026825860"
Apr 24 21:28:35.939441 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:35.939385 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-pj7mx" podStartSLOduration=6.843338757 podStartE2EDuration="7.9393683s" podCreationTimestamp="2026-04-24 21:28:28 +0000 UTC" firstStartedPulling="2026-04-24 21:28:32.651771936 +0000 UTC m=+71.760865403" lastFinishedPulling="2026-04-24 21:28:33.74780148 +0000 UTC m=+72.856894946" observedRunningTime="2026-04-24 21:28:35.93882589 +0000 UTC m=+75.047919402" watchObservedRunningTime="2026-04-24 21:28:35.9393683 +0000 UTC m=+75.048461805"
Apr 24 21:28:38.828313 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:38.828191 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5954c7d68c-jd96l"
Apr 24 21:28:40.178323 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.178259 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" podUID="2e08127c-5e30-4001-8d57-fe051bc907b1" containerName="registry" containerID="cri-o://b81f0ebf9b77c93343fb00c53ec708b36381078dbe4291a9293559e044dc976d" gracePeriod=30
Apr 24 21:28:40.469914 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.469886 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-579b557bd8-ns9mf"
Apr 24 21:28:40.550256 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.550223 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2e08127c-5e30-4001-8d57-fe051bc907b1-image-registry-private-configuration\") pod \"2e08127c-5e30-4001-8d57-fe051bc907b1\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") "
Apr 24 21:28:40.550442 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.550283 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdhd5\" (UniqueName: \"kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-kube-api-access-xdhd5\") pod \"2e08127c-5e30-4001-8d57-fe051bc907b1\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") "
Apr 24 21:28:40.550442 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.550323 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2e08127c-5e30-4001-8d57-fe051bc907b1-ca-trust-extracted\") pod \"2e08127c-5e30-4001-8d57-fe051bc907b1\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") "
Apr 24 21:28:40.550442 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.550351 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-tls\") pod \"2e08127c-5e30-4001-8d57-fe051bc907b1\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") "
Apr 24 21:28:40.550442 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.550405 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e08127c-5e30-4001-8d57-fe051bc907b1-trusted-ca\") pod \"2e08127c-5e30-4001-8d57-fe051bc907b1\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") "
Apr 24 21:28:40.550442 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.550432 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2e08127c-5e30-4001-8d57-fe051bc907b1-installation-pull-secrets\") pod \"2e08127c-5e30-4001-8d57-fe051bc907b1\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") "
Apr 24 21:28:40.550665 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.550454 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-bound-sa-token\") pod \"2e08127c-5e30-4001-8d57-fe051bc907b1\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") "
Apr 24 21:28:40.550665 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.550504 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-certificates\") pod \"2e08127c-5e30-4001-8d57-fe051bc907b1\" (UID: \"2e08127c-5e30-4001-8d57-fe051bc907b1\") "
Apr 24 21:28:40.550988 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.550959 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2e08127c-5e30-4001-8d57-fe051bc907b1" (UID: "2e08127c-5e30-4001-8d57-fe051bc907b1"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:28:40.551235 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.551197 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e08127c-5e30-4001-8d57-fe051bc907b1-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2e08127c-5e30-4001-8d57-fe051bc907b1" (UID: "2e08127c-5e30-4001-8d57-fe051bc907b1"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:28:40.553466 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.553422 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e08127c-5e30-4001-8d57-fe051bc907b1-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "2e08127c-5e30-4001-8d57-fe051bc907b1" (UID: "2e08127c-5e30-4001-8d57-fe051bc907b1"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:28:40.554623 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.554597 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-kube-api-access-xdhd5" (OuterVolumeSpecName: "kube-api-access-xdhd5") pod "2e08127c-5e30-4001-8d57-fe051bc907b1" (UID: "2e08127c-5e30-4001-8d57-fe051bc907b1"). InnerVolumeSpecName "kube-api-access-xdhd5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:28:40.554720 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.554685 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2e08127c-5e30-4001-8d57-fe051bc907b1" (UID: "2e08127c-5e30-4001-8d57-fe051bc907b1"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:28:40.555332 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.555286 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2e08127c-5e30-4001-8d57-fe051bc907b1" (UID: "2e08127c-5e30-4001-8d57-fe051bc907b1"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:28:40.556221 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.556189 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e08127c-5e30-4001-8d57-fe051bc907b1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2e08127c-5e30-4001-8d57-fe051bc907b1" (UID: "2e08127c-5e30-4001-8d57-fe051bc907b1"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:28:40.561420 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.561393 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e08127c-5e30-4001-8d57-fe051bc907b1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2e08127c-5e30-4001-8d57-fe051bc907b1" (UID: "2e08127c-5e30-4001-8d57-fe051bc907b1"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:28:40.651038 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.650995 2581 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-certificates\") on node \"ip-10-0-130-31.ec2.internal\" DevicePath \"\""
Apr 24 21:28:40.651199 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.651055 2581 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2e08127c-5e30-4001-8d57-fe051bc907b1-image-registry-private-configuration\") on node \"ip-10-0-130-31.ec2.internal\" DevicePath \"\""
Apr 24 21:28:40.651199 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.651072 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xdhd5\" (UniqueName: \"kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-kube-api-access-xdhd5\") on node \"ip-10-0-130-31.ec2.internal\" DevicePath \"\""
Apr 24 21:28:40.651199 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.651086 2581 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2e08127c-5e30-4001-8d57-fe051bc907b1-ca-trust-extracted\") on node \"ip-10-0-130-31.ec2.internal\" DevicePath \"\""
Apr 24 21:28:40.651199 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.651102 2581 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-registry-tls\") on node \"ip-10-0-130-31.ec2.internal\" DevicePath \"\""
Apr 24 21:28:40.651199 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.651114 2581 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2e08127c-5e30-4001-8d57-fe051bc907b1-trusted-ca\") on node \"ip-10-0-130-31.ec2.internal\" DevicePath \"\""
Apr 24 21:28:40.651199 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.651122 2581 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2e08127c-5e30-4001-8d57-fe051bc907b1-installation-pull-secrets\") on node \"ip-10-0-130-31.ec2.internal\" DevicePath \"\""
Apr 24 21:28:40.651199 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.651131 2581 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2e08127c-5e30-4001-8d57-fe051bc907b1-bound-sa-token\") on node \"ip-10-0-130-31.ec2.internal\" DevicePath \"\""
Apr 24 21:28:40.920950 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.920915 2581 generic.go:358] "Generic (PLEG): container finished" podID="2e08127c-5e30-4001-8d57-fe051bc907b1" containerID="b81f0ebf9b77c93343fb00c53ec708b36381078dbe4291a9293559e044dc976d" exitCode=0
Apr 24 21:28:40.921169 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.920987 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" event={"ID":"2e08127c-5e30-4001-8d57-fe051bc907b1","Type":"ContainerDied","Data":"b81f0ebf9b77c93343fb00c53ec708b36381078dbe4291a9293559e044dc976d"}
Apr 24 21:28:40.921169 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.921035 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-579b557bd8-ns9mf" event={"ID":"2e08127c-5e30-4001-8d57-fe051bc907b1","Type":"ContainerDied","Data":"737f5da6b9dac89450244187938c3a00ee09a678f45eed57cd94a1c9f544205d"}
Apr 24 21:28:40.921169 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.921057 2581 scope.go:117] "RemoveContainer" containerID="b81f0ebf9b77c93343fb00c53ec708b36381078dbe4291a9293559e044dc976d"
Apr 24 21:28:40.921169 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.920995 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-579b557bd8-ns9mf"
Apr 24 21:28:40.930977 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.930955 2581 scope.go:117] "RemoveContainer" containerID="b81f0ebf9b77c93343fb00c53ec708b36381078dbe4291a9293559e044dc976d"
Apr 24 21:28:40.931360 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:28:40.931328 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b81f0ebf9b77c93343fb00c53ec708b36381078dbe4291a9293559e044dc976d\": container with ID starting with b81f0ebf9b77c93343fb00c53ec708b36381078dbe4291a9293559e044dc976d not found: ID does not exist" containerID="b81f0ebf9b77c93343fb00c53ec708b36381078dbe4291a9293559e044dc976d"
Apr 24 21:28:40.931452 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.931362 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b81f0ebf9b77c93343fb00c53ec708b36381078dbe4291a9293559e044dc976d"} err="failed to get container status \"b81f0ebf9b77c93343fb00c53ec708b36381078dbe4291a9293559e044dc976d\": rpc error: code = NotFound desc = could not find container \"b81f0ebf9b77c93343fb00c53ec708b36381078dbe4291a9293559e044dc976d\": container with ID starting with b81f0ebf9b77c93343fb00c53ec708b36381078dbe4291a9293559e044dc976d not found: ID does not exist"
Apr 24 21:28:40.945079 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.945015 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-579b557bd8-ns9mf"]
Apr 24 21:28:40.953884 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:40.953852 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-579b557bd8-ns9mf"]
Apr 24 21:28:41.524328 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:41.524298 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e08127c-5e30-4001-8d57-fe051bc907b1" path="/var/lib/kubelet/pods/2e08127c-5e30-4001-8d57-fe051bc907b1/volumes"
Apr 24 21:28:59.918857 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:28:59.918826 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zdbnj_78d3eb37-5559-42a5-b81b-c2219787cc5b/serve-healthcheck-canary/0.log"
Apr 24 21:30:52.373557 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:52.373518 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2"]
Apr 24 21:30:52.374218 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:52.373888 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e08127c-5e30-4001-8d57-fe051bc907b1" containerName="registry"
Apr 24 21:30:52.374218 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:52.373907 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e08127c-5e30-4001-8d57-fe051bc907b1" containerName="registry"
Apr 24 21:30:52.374218 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:52.373964 2581
memory_manager.go:356] "RemoveStaleState removing state" podUID="2e08127c-5e30-4001-8d57-fe051bc907b1" containerName="registry" Apr 24 21:30:52.376884 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:52.376861 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2" Apr 24 21:30:52.380364 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:52.380340 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 21:30:52.380574 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:52.380560 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 21:30:52.381298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:52.381284 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-k2khp\"" Apr 24 21:30:52.389994 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:52.389972 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2"] Apr 24 21:30:52.481655 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:52.481616 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99f21298-be0c-4e0b-b376-a53e3d4b3c2b-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2\" (UID: \"99f21298-be0c-4e0b-b376-a53e3d4b3c2b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2" Apr 24 21:30:52.481845 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:52.481673 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbxgk\" (UniqueName: 
\"kubernetes.io/projected/99f21298-be0c-4e0b-b376-a53e3d4b3c2b-kube-api-access-lbxgk\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2\" (UID: \"99f21298-be0c-4e0b-b376-a53e3d4b3c2b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2" Apr 24 21:30:52.481845 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:52.481712 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99f21298-be0c-4e0b-b376-a53e3d4b3c2b-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2\" (UID: \"99f21298-be0c-4e0b-b376-a53e3d4b3c2b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2" Apr 24 21:30:52.582252 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:52.582211 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99f21298-be0c-4e0b-b376-a53e3d4b3c2b-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2\" (UID: \"99f21298-be0c-4e0b-b376-a53e3d4b3c2b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2" Apr 24 21:30:52.582428 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:52.582262 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbxgk\" (UniqueName: \"kubernetes.io/projected/99f21298-be0c-4e0b-b376-a53e3d4b3c2b-kube-api-access-lbxgk\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2\" (UID: \"99f21298-be0c-4e0b-b376-a53e3d4b3c2b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2" Apr 24 21:30:52.582428 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:52.582289 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/99f21298-be0c-4e0b-b376-a53e3d4b3c2b-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2\" (UID: \"99f21298-be0c-4e0b-b376-a53e3d4b3c2b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2" Apr 24 21:30:52.582613 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:52.582596 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99f21298-be0c-4e0b-b376-a53e3d4b3c2b-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2\" (UID: \"99f21298-be0c-4e0b-b376-a53e3d4b3c2b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2" Apr 24 21:30:52.582690 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:52.582665 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99f21298-be0c-4e0b-b376-a53e3d4b3c2b-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2\" (UID: \"99f21298-be0c-4e0b-b376-a53e3d4b3c2b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2" Apr 24 21:30:52.591247 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:52.591223 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbxgk\" (UniqueName: \"kubernetes.io/projected/99f21298-be0c-4e0b-b376-a53e3d4b3c2b-kube-api-access-lbxgk\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2\" (UID: \"99f21298-be0c-4e0b-b376-a53e3d4b3c2b\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2" Apr 24 21:30:52.686385 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:52.686299 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2" Apr 24 21:30:52.805474 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:52.805457 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2"] Apr 24 21:30:52.808092 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:30:52.808058 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99f21298_be0c_4e0b_b376_a53e3d4b3c2b.slice/crio-f2ee6a84a92b29534f06f99f2f30251cc0af2c7fcf1639c54c941addf830bbc8 WatchSource:0}: Error finding container f2ee6a84a92b29534f06f99f2f30251cc0af2c7fcf1639c54c941addf830bbc8: Status 404 returned error can't find the container with id f2ee6a84a92b29534f06f99f2f30251cc0af2c7fcf1639c54c941addf830bbc8 Apr 24 21:30:53.274305 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:53.274261 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2" event={"ID":"99f21298-be0c-4e0b-b376-a53e3d4b3c2b","Type":"ContainerStarted","Data":"f2ee6a84a92b29534f06f99f2f30251cc0af2c7fcf1639c54c941addf830bbc8"} Apr 24 21:30:58.291361 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:58.291321 2581 generic.go:358] "Generic (PLEG): container finished" podID="99f21298-be0c-4e0b-b376-a53e3d4b3c2b" containerID="d33b99c81c7457842e0f8dd2ef257bf6defa57738014519162946a1092842b2c" exitCode=0 Apr 24 21:30:58.291802 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:30:58.291418 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2" event={"ID":"99f21298-be0c-4e0b-b376-a53e3d4b3c2b","Type":"ContainerDied","Data":"d33b99c81c7457842e0f8dd2ef257bf6defa57738014519162946a1092842b2c"} Apr 24 21:31:05.310463 ip-10-0-130-31 kubenswrapper[2581]: I0424 
21:31:05.310422 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2" event={"ID":"99f21298-be0c-4e0b-b376-a53e3d4b3c2b","Type":"ContainerStarted","Data":"efce5f7ffb77d9d11e344c18b3bd3cf5c0034fcf7c6e7c605dae1d357d545582"} Apr 24 21:31:06.314575 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:06.314539 2581 generic.go:358] "Generic (PLEG): container finished" podID="99f21298-be0c-4e0b-b376-a53e3d4b3c2b" containerID="efce5f7ffb77d9d11e344c18b3bd3cf5c0034fcf7c6e7c605dae1d357d545582" exitCode=0 Apr 24 21:31:06.314934 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:06.314590 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2" event={"ID":"99f21298-be0c-4e0b-b376-a53e3d4b3c2b","Type":"ContainerDied","Data":"efce5f7ffb77d9d11e344c18b3bd3cf5c0034fcf7c6e7c605dae1d357d545582"} Apr 24 21:31:12.336660 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:12.336626 2581 generic.go:358] "Generic (PLEG): container finished" podID="99f21298-be0c-4e0b-b376-a53e3d4b3c2b" containerID="0cc5ada1909051223b86f42e212ed98e8f18142541012a1a502da3daea6254e6" exitCode=0 Apr 24 21:31:12.337054 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:12.336715 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2" event={"ID":"99f21298-be0c-4e0b-b376-a53e3d4b3c2b","Type":"ContainerDied","Data":"0cc5ada1909051223b86f42e212ed98e8f18142541012a1a502da3daea6254e6"} Apr 24 21:31:13.465153 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:13.465132 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2" Apr 24 21:31:13.552537 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:13.552507 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99f21298-be0c-4e0b-b376-a53e3d4b3c2b-util\") pod \"99f21298-be0c-4e0b-b376-a53e3d4b3c2b\" (UID: \"99f21298-be0c-4e0b-b376-a53e3d4b3c2b\") " Apr 24 21:31:13.552693 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:13.552544 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99f21298-be0c-4e0b-b376-a53e3d4b3c2b-bundle\") pod \"99f21298-be0c-4e0b-b376-a53e3d4b3c2b\" (UID: \"99f21298-be0c-4e0b-b376-a53e3d4b3c2b\") " Apr 24 21:31:13.552693 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:13.552606 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbxgk\" (UniqueName: \"kubernetes.io/projected/99f21298-be0c-4e0b-b376-a53e3d4b3c2b-kube-api-access-lbxgk\") pod \"99f21298-be0c-4e0b-b376-a53e3d4b3c2b\" (UID: \"99f21298-be0c-4e0b-b376-a53e3d4b3c2b\") " Apr 24 21:31:13.553164 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:13.553138 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99f21298-be0c-4e0b-b376-a53e3d4b3c2b-bundle" (OuterVolumeSpecName: "bundle") pod "99f21298-be0c-4e0b-b376-a53e3d4b3c2b" (UID: "99f21298-be0c-4e0b-b376-a53e3d4b3c2b"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:31:13.554832 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:13.554814 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99f21298-be0c-4e0b-b376-a53e3d4b3c2b-kube-api-access-lbxgk" (OuterVolumeSpecName: "kube-api-access-lbxgk") pod "99f21298-be0c-4e0b-b376-a53e3d4b3c2b" (UID: "99f21298-be0c-4e0b-b376-a53e3d4b3c2b"). InnerVolumeSpecName "kube-api-access-lbxgk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:31:13.557486 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:13.557459 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99f21298-be0c-4e0b-b376-a53e3d4b3c2b-util" (OuterVolumeSpecName: "util") pod "99f21298-be0c-4e0b-b376-a53e3d4b3c2b" (UID: "99f21298-be0c-4e0b-b376-a53e3d4b3c2b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:31:13.653576 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:13.653506 2581 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99f21298-be0c-4e0b-b376-a53e3d4b3c2b-util\") on node \"ip-10-0-130-31.ec2.internal\" DevicePath \"\"" Apr 24 21:31:13.653576 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:13.653532 2581 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99f21298-be0c-4e0b-b376-a53e3d4b3c2b-bundle\") on node \"ip-10-0-130-31.ec2.internal\" DevicePath \"\"" Apr 24 21:31:13.653576 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:13.653542 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lbxgk\" (UniqueName: \"kubernetes.io/projected/99f21298-be0c-4e0b-b376-a53e3d4b3c2b-kube-api-access-lbxgk\") on node \"ip-10-0-130-31.ec2.internal\" DevicePath \"\"" Apr 24 21:31:14.343495 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:14.343468 2581 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2" Apr 24 21:31:14.343656 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:14.343465 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cl5xm2" event={"ID":"99f21298-be0c-4e0b-b376-a53e3d4b3c2b","Type":"ContainerDied","Data":"f2ee6a84a92b29534f06f99f2f30251cc0af2c7fcf1639c54c941addf830bbc8"} Apr 24 21:31:14.343656 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:14.343579 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2ee6a84a92b29534f06f99f2f30251cc0af2c7fcf1639c54c941addf830bbc8" Apr 24 21:31:19.868444 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:19.868404 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d"] Apr 24 21:31:19.868835 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:19.868677 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99f21298-be0c-4e0b-b376-a53e3d4b3c2b" containerName="extract" Apr 24 21:31:19.868835 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:19.868689 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f21298-be0c-4e0b-b376-a53e3d4b3c2b" containerName="extract" Apr 24 21:31:19.868835 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:19.868700 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99f21298-be0c-4e0b-b376-a53e3d4b3c2b" containerName="util" Apr 24 21:31:19.868835 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:19.868705 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f21298-be0c-4e0b-b376-a53e3d4b3c2b" containerName="util" Apr 24 21:31:19.868835 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:19.868714 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="99f21298-be0c-4e0b-b376-a53e3d4b3c2b" containerName="pull" Apr 24 21:31:19.868835 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:19.868720 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="99f21298-be0c-4e0b-b376-a53e3d4b3c2b" containerName="pull" Apr 24 21:31:19.868835 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:19.868761 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="99f21298-be0c-4e0b-b376-a53e3d4b3c2b" containerName="extract" Apr 24 21:31:19.872645 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:19.872629 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d" Apr 24 21:31:19.875087 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:19.875066 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-vwwbz\"" Apr 24 21:31:19.875402 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:19.875384 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 21:31:19.875402 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:19.875398 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 21:31:19.875767 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:19.875749 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 21:31:19.886933 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:19.886908 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d"] Apr 24 21:31:19.999507 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:19.999474 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/97a5d6bc-37c8-4010-88ff-d1fddbd0898f-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d\" (UID: \"97a5d6bc-37c8-4010-88ff-d1fddbd0898f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d" Apr 24 21:31:19.999680 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:19.999528 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m9wh\" (UniqueName: \"kubernetes.io/projected/97a5d6bc-37c8-4010-88ff-d1fddbd0898f-kube-api-access-6m9wh\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d\" (UID: \"97a5d6bc-37c8-4010-88ff-d1fddbd0898f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d" Apr 24 21:31:20.100369 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:20.100340 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/97a5d6bc-37c8-4010-88ff-d1fddbd0898f-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d\" (UID: \"97a5d6bc-37c8-4010-88ff-d1fddbd0898f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d" Apr 24 21:31:20.100510 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:20.100389 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6m9wh\" (UniqueName: \"kubernetes.io/projected/97a5d6bc-37c8-4010-88ff-d1fddbd0898f-kube-api-access-6m9wh\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d\" (UID: \"97a5d6bc-37c8-4010-88ff-d1fddbd0898f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d" Apr 24 21:31:20.102773 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:20.102743 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/97a5d6bc-37c8-4010-88ff-d1fddbd0898f-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d\" (UID: 
\"97a5d6bc-37c8-4010-88ff-d1fddbd0898f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d" Apr 24 21:31:20.116786 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:20.116759 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m9wh\" (UniqueName: \"kubernetes.io/projected/97a5d6bc-37c8-4010-88ff-d1fddbd0898f-kube-api-access-6m9wh\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d\" (UID: \"97a5d6bc-37c8-4010-88ff-d1fddbd0898f\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d" Apr 24 21:31:20.185715 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:20.185645 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d" Apr 24 21:31:20.316304 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:20.316281 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d"] Apr 24 21:31:20.317944 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:31:20.317915 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97a5d6bc_37c8_4010_88ff_d1fddbd0898f.slice/crio-eb258642d23c1e19ee5fb504a730ea1e419d69b25b0a9ef4a140a2df958671e0 WatchSource:0}: Error finding container eb258642d23c1e19ee5fb504a730ea1e419d69b25b0a9ef4a140a2df958671e0: Status 404 returned error can't find the container with id eb258642d23c1e19ee5fb504a730ea1e419d69b25b0a9ef4a140a2df958671e0 Apr 24 21:31:20.360948 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:20.360909 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d" event={"ID":"97a5d6bc-37c8-4010-88ff-d1fddbd0898f","Type":"ContainerStarted","Data":"eb258642d23c1e19ee5fb504a730ea1e419d69b25b0a9ef4a140a2df958671e0"} Apr 24 21:31:23.370550 ip-10-0-130-31 kubenswrapper[2581]: I0424 
21:31:23.370512 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d" event={"ID":"97a5d6bc-37c8-4010-88ff-d1fddbd0898f","Type":"ContainerStarted","Data":"2bd30268c3c8dbd5f5d69069ef951e543bba2ad8ae765dffb46ff43c3080760a"} Apr 24 21:31:23.370936 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:23.370657 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d" Apr 24 21:31:23.393819 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:23.393755 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d" podStartSLOduration=1.5019458719999998 podStartE2EDuration="4.39373718s" podCreationTimestamp="2026-04-24 21:31:19 +0000 UTC" firstStartedPulling="2026-04-24 21:31:20.319626692 +0000 UTC m=+239.428720155" lastFinishedPulling="2026-04-24 21:31:23.211417783 +0000 UTC m=+242.320511463" observedRunningTime="2026-04-24 21:31:23.393114179 +0000 UTC m=+242.502207664" watchObservedRunningTime="2026-04-24 21:31:23.39373718 +0000 UTC m=+242.502830671" Apr 24 21:31:23.832549 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:23.832512 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-xv6mh"] Apr 24 21:31:23.835668 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:23.835651 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-xv6mh" Apr 24 21:31:23.838611 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:23.838588 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 21:31:23.838723 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:23.838588 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-bzzwh\"" Apr 24 21:31:23.838847 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:23.838835 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 24 21:31:23.845307 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:23.845285 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-xv6mh"] Apr 24 21:31:23.930770 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:23.930740 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkcfr\" (UniqueName: \"kubernetes.io/projected/1b6280f7-1c9b-460d-88e4-0de2f0be886d-kube-api-access-nkcfr\") pod \"keda-operator-ffbb595cb-xv6mh\" (UID: \"1b6280f7-1c9b-460d-88e4-0de2f0be886d\") " pod="openshift-keda/keda-operator-ffbb595cb-xv6mh" Apr 24 21:31:23.930941 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:23.930783 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1b6280f7-1c9b-460d-88e4-0de2f0be886d-certificates\") pod \"keda-operator-ffbb595cb-xv6mh\" (UID: \"1b6280f7-1c9b-460d-88e4-0de2f0be886d\") " pod="openshift-keda/keda-operator-ffbb595cb-xv6mh" Apr 24 21:31:23.930941 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:23.930801 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: 
\"kubernetes.io/configmap/1b6280f7-1c9b-460d-88e4-0de2f0be886d-cabundle0\") pod \"keda-operator-ffbb595cb-xv6mh\" (UID: \"1b6280f7-1c9b-460d-88e4-0de2f0be886d\") " pod="openshift-keda/keda-operator-ffbb595cb-xv6mh" Apr 24 21:31:24.032201 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.032164 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nkcfr\" (UniqueName: \"kubernetes.io/projected/1b6280f7-1c9b-460d-88e4-0de2f0be886d-kube-api-access-nkcfr\") pod \"keda-operator-ffbb595cb-xv6mh\" (UID: \"1b6280f7-1c9b-460d-88e4-0de2f0be886d\") " pod="openshift-keda/keda-operator-ffbb595cb-xv6mh" Apr 24 21:31:24.032201 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.032207 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1b6280f7-1c9b-460d-88e4-0de2f0be886d-certificates\") pod \"keda-operator-ffbb595cb-xv6mh\" (UID: \"1b6280f7-1c9b-460d-88e4-0de2f0be886d\") " pod="openshift-keda/keda-operator-ffbb595cb-xv6mh" Apr 24 21:31:24.032430 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.032223 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/1b6280f7-1c9b-460d-88e4-0de2f0be886d-cabundle0\") pod \"keda-operator-ffbb595cb-xv6mh\" (UID: \"1b6280f7-1c9b-460d-88e4-0de2f0be886d\") " pod="openshift-keda/keda-operator-ffbb595cb-xv6mh" Apr 24 21:31:24.032430 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:24.032328 2581 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 24 21:31:24.032430 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:24.032353 2581 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:31:24.032430 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:24.032364 2581 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: 
ca.crt Apr 24 21:31:24.032430 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:24.032379 2581 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-xv6mh: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 24 21:31:24.032610 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:24.032451 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b6280f7-1c9b-460d-88e4-0de2f0be886d-certificates podName:1b6280f7-1c9b-460d-88e4-0de2f0be886d nodeName:}" failed. No retries permitted until 2026-04-24 21:31:24.532431647 +0000 UTC m=+243.641525114 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1b6280f7-1c9b-460d-88e4-0de2f0be886d-certificates") pod "keda-operator-ffbb595cb-xv6mh" (UID: "1b6280f7-1c9b-460d-88e4-0de2f0be886d") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 24 21:31:24.032822 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.032805 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/1b6280f7-1c9b-460d-88e4-0de2f0be886d-cabundle0\") pod \"keda-operator-ffbb595cb-xv6mh\" (UID: \"1b6280f7-1c9b-460d-88e4-0de2f0be886d\") " pod="openshift-keda/keda-operator-ffbb595cb-xv6mh" Apr 24 21:31:24.050876 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.050852 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkcfr\" (UniqueName: \"kubernetes.io/projected/1b6280f7-1c9b-460d-88e4-0de2f0be886d-kube-api-access-nkcfr\") pod \"keda-operator-ffbb595cb-xv6mh\" (UID: \"1b6280f7-1c9b-460d-88e4-0de2f0be886d\") " pod="openshift-keda/keda-operator-ffbb595cb-xv6mh" Apr 24 21:31:24.283984 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.283952 2581 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn"] Apr 24 21:31:24.287584 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.287561 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn" Apr 24 21:31:24.290145 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.290123 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 24 21:31:24.305377 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.305355 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn"] Apr 24 21:31:24.435666 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.435634 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/e4e605c8-403a-467c-ab81-a53be22351c6-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-w67kn\" (UID: \"e4e605c8-403a-467c-ab81-a53be22351c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn" Apr 24 21:31:24.435666 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.435679 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e4e605c8-403a-467c-ab81-a53be22351c6-certificates\") pod \"keda-metrics-apiserver-7c9f485588-w67kn\" (UID: \"e4e605c8-403a-467c-ab81-a53be22351c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn" Apr 24 21:31:24.436180 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.435711 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdjjx\" (UniqueName: \"kubernetes.io/projected/e4e605c8-403a-467c-ab81-a53be22351c6-kube-api-access-kdjjx\") pod \"keda-metrics-apiserver-7c9f485588-w67kn\" (UID: \"e4e605c8-403a-467c-ab81-a53be22351c6\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn" Apr 24 21:31:24.536291 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.536197 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1b6280f7-1c9b-460d-88e4-0de2f0be886d-certificates\") pod \"keda-operator-ffbb595cb-xv6mh\" (UID: \"1b6280f7-1c9b-460d-88e4-0de2f0be886d\") " pod="openshift-keda/keda-operator-ffbb595cb-xv6mh" Apr 24 21:31:24.536291 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.536244 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/e4e605c8-403a-467c-ab81-a53be22351c6-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-w67kn\" (UID: \"e4e605c8-403a-467c-ab81-a53be22351c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn" Apr 24 21:31:24.536515 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:24.536349 2581 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:31:24.536515 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:24.536376 2581 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:31:24.536515 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:24.536388 2581 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-xv6mh: references non-existent secret key: ca.crt Apr 24 21:31:24.536515 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.536386 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e4e605c8-403a-467c-ab81-a53be22351c6-certificates\") pod \"keda-metrics-apiserver-7c9f485588-w67kn\" (UID: \"e4e605c8-403a-467c-ab81-a53be22351c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn" Apr 24 21:31:24.536515 ip-10-0-130-31 kubenswrapper[2581]: I0424 
21:31:24.536422 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kdjjx\" (UniqueName: \"kubernetes.io/projected/e4e605c8-403a-467c-ab81-a53be22351c6-kube-api-access-kdjjx\") pod \"keda-metrics-apiserver-7c9f485588-w67kn\" (UID: \"e4e605c8-403a-467c-ab81-a53be22351c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn" Apr 24 21:31:24.536515 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:24.536447 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b6280f7-1c9b-460d-88e4-0de2f0be886d-certificates podName:1b6280f7-1c9b-460d-88e4-0de2f0be886d nodeName:}" failed. No retries permitted until 2026-04-24 21:31:25.536426952 +0000 UTC m=+244.645520428 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1b6280f7-1c9b-460d-88e4-0de2f0be886d-certificates") pod "keda-operator-ffbb595cb-xv6mh" (UID: "1b6280f7-1c9b-460d-88e4-0de2f0be886d") : references non-existent secret key: ca.crt Apr 24 21:31:24.536865 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:24.536525 2581 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:31:24.536865 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:24.536546 2581 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:31:24.536865 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:24.536568 2581 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn: references non-existent secret key: tls.crt Apr 24 21:31:24.536865 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:24.536615 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4e605c8-403a-467c-ab81-a53be22351c6-certificates podName:e4e605c8-403a-467c-ab81-a53be22351c6 nodeName:}" failed. 
No retries permitted until 2026-04-24 21:31:25.03659845 +0000 UTC m=+244.145691917 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e4e605c8-403a-467c-ab81-a53be22351c6-certificates") pod "keda-metrics-apiserver-7c9f485588-w67kn" (UID: "e4e605c8-403a-467c-ab81-a53be22351c6") : references non-existent secret key: tls.crt Apr 24 21:31:24.536865 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.536614 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/e4e605c8-403a-467c-ab81-a53be22351c6-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-w67kn\" (UID: \"e4e605c8-403a-467c-ab81-a53be22351c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn" Apr 24 21:31:24.552582 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.552549 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdjjx\" (UniqueName: \"kubernetes.io/projected/e4e605c8-403a-467c-ab81-a53be22351c6-kube-api-access-kdjjx\") pod \"keda-metrics-apiserver-7c9f485588-w67kn\" (UID: \"e4e605c8-403a-467c-ab81-a53be22351c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn" Apr 24 21:31:24.557475 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.557450 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-d7zsp"] Apr 24 21:31:24.560896 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.560880 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-d7zsp" Apr 24 21:31:24.564596 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.564580 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 24 21:31:24.585961 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.585938 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-d7zsp"] Apr 24 21:31:24.738296 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.738237 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e160702f-e9c0-4c05-96b6-712bc9933498-certificates\") pod \"keda-admission-cf49989db-d7zsp\" (UID: \"e160702f-e9c0-4c05-96b6-712bc9933498\") " pod="openshift-keda/keda-admission-cf49989db-d7zsp" Apr 24 21:31:24.738498 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.738327 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cwft\" (UniqueName: \"kubernetes.io/projected/e160702f-e9c0-4c05-96b6-712bc9933498-kube-api-access-9cwft\") pod \"keda-admission-cf49989db-d7zsp\" (UID: \"e160702f-e9c0-4c05-96b6-712bc9933498\") " pod="openshift-keda/keda-admission-cf49989db-d7zsp" Apr 24 21:31:24.839656 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.839562 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e160702f-e9c0-4c05-96b6-712bc9933498-certificates\") pod \"keda-admission-cf49989db-d7zsp\" (UID: \"e160702f-e9c0-4c05-96b6-712bc9933498\") " pod="openshift-keda/keda-admission-cf49989db-d7zsp" Apr 24 21:31:24.839656 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.839621 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cwft\" (UniqueName: 
\"kubernetes.io/projected/e160702f-e9c0-4c05-96b6-712bc9933498-kube-api-access-9cwft\") pod \"keda-admission-cf49989db-d7zsp\" (UID: \"e160702f-e9c0-4c05-96b6-712bc9933498\") " pod="openshift-keda/keda-admission-cf49989db-d7zsp" Apr 24 21:31:24.839891 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:24.839723 2581 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 24 21:31:24.839891 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:24.839750 2581 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-d7zsp: secret "keda-admission-webhooks-certs" not found Apr 24 21:31:24.839891 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:24.839812 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e160702f-e9c0-4c05-96b6-712bc9933498-certificates podName:e160702f-e9c0-4c05-96b6-712bc9933498 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:25.339786133 +0000 UTC m=+244.448879603 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e160702f-e9c0-4c05-96b6-712bc9933498-certificates") pod "keda-admission-cf49989db-d7zsp" (UID: "e160702f-e9c0-4c05-96b6-712bc9933498") : secret "keda-admission-webhooks-certs" not found Apr 24 21:31:24.864560 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:24.864538 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cwft\" (UniqueName: \"kubernetes.io/projected/e160702f-e9c0-4c05-96b6-712bc9933498-kube-api-access-9cwft\") pod \"keda-admission-cf49989db-d7zsp\" (UID: \"e160702f-e9c0-4c05-96b6-712bc9933498\") " pod="openshift-keda/keda-admission-cf49989db-d7zsp" Apr 24 21:31:25.041356 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:25.041319 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e4e605c8-403a-467c-ab81-a53be22351c6-certificates\") pod \"keda-metrics-apiserver-7c9f485588-w67kn\" (UID: \"e4e605c8-403a-467c-ab81-a53be22351c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn" Apr 24 21:31:25.041521 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:25.041474 2581 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:31:25.041521 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:25.041492 2581 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:31:25.041521 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:25.041509 2581 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn: references non-existent secret key: tls.crt Apr 24 21:31:25.041642 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:25.041566 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4e605c8-403a-467c-ab81-a53be22351c6-certificates 
podName:e4e605c8-403a-467c-ab81-a53be22351c6 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:26.041547633 +0000 UTC m=+245.150641095 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e4e605c8-403a-467c-ab81-a53be22351c6-certificates") pod "keda-metrics-apiserver-7c9f485588-w67kn" (UID: "e4e605c8-403a-467c-ab81-a53be22351c6") : references non-existent secret key: tls.crt Apr 24 21:31:25.344546 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:25.344502 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e160702f-e9c0-4c05-96b6-712bc9933498-certificates\") pod \"keda-admission-cf49989db-d7zsp\" (UID: \"e160702f-e9c0-4c05-96b6-712bc9933498\") " pod="openshift-keda/keda-admission-cf49989db-d7zsp" Apr 24 21:31:25.348440 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:25.348412 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e160702f-e9c0-4c05-96b6-712bc9933498-certificates\") pod \"keda-admission-cf49989db-d7zsp\" (UID: \"e160702f-e9c0-4c05-96b6-712bc9933498\") " pod="openshift-keda/keda-admission-cf49989db-d7zsp" Apr 24 21:31:25.470773 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:25.470723 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-d7zsp" Apr 24 21:31:25.546066 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:25.546035 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1b6280f7-1c9b-460d-88e4-0de2f0be886d-certificates\") pod \"keda-operator-ffbb595cb-xv6mh\" (UID: \"1b6280f7-1c9b-460d-88e4-0de2f0be886d\") " pod="openshift-keda/keda-operator-ffbb595cb-xv6mh" Apr 24 21:31:25.546196 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:25.546136 2581 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:31:25.546196 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:25.546154 2581 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:31:25.546196 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:25.546163 2581 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-xv6mh: references non-existent secret key: ca.crt Apr 24 21:31:25.546304 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:25.546236 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b6280f7-1c9b-460d-88e4-0de2f0be886d-certificates podName:1b6280f7-1c9b-460d-88e4-0de2f0be886d nodeName:}" failed. No retries permitted until 2026-04-24 21:31:27.546221379 +0000 UTC m=+246.655314841 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1b6280f7-1c9b-460d-88e4-0de2f0be886d-certificates") pod "keda-operator-ffbb595cb-xv6mh" (UID: "1b6280f7-1c9b-460d-88e4-0de2f0be886d") : references non-existent secret key: ca.crt Apr 24 21:31:25.657535 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:25.657512 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-d7zsp"] Apr 24 21:31:25.659661 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:31:25.659624 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode160702f_e9c0_4c05_96b6_712bc9933498.slice/crio-90353e561e823217da54b0f466dc1f069748113b9cf2ccbb8dcacbf08eb565df WatchSource:0}: Error finding container 90353e561e823217da54b0f466dc1f069748113b9cf2ccbb8dcacbf08eb565df: Status 404 returned error can't find the container with id 90353e561e823217da54b0f466dc1f069748113b9cf2ccbb8dcacbf08eb565df Apr 24 21:31:26.050368 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:26.050342 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e4e605c8-403a-467c-ab81-a53be22351c6-certificates\") pod \"keda-metrics-apiserver-7c9f485588-w67kn\" (UID: \"e4e605c8-403a-467c-ab81-a53be22351c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn" Apr 24 21:31:26.050539 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:26.050457 2581 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:31:26.050539 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:26.050471 2581 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:31:26.050539 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:26.050487 2581 projected.go:194] Error preparing data for projected volume certificates for pod 
openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn: references non-existent secret key: tls.crt Apr 24 21:31:26.050643 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:26.050542 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4e605c8-403a-467c-ab81-a53be22351c6-certificates podName:e4e605c8-403a-467c-ab81-a53be22351c6 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:28.050528334 +0000 UTC m=+247.159621797 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e4e605c8-403a-467c-ab81-a53be22351c6-certificates") pod "keda-metrics-apiserver-7c9f485588-w67kn" (UID: "e4e605c8-403a-467c-ab81-a53be22351c6") : references non-existent secret key: tls.crt Apr 24 21:31:26.384920 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:26.382164 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-d7zsp" event={"ID":"e160702f-e9c0-4c05-96b6-712bc9933498","Type":"ContainerStarted","Data":"90353e561e823217da54b0f466dc1f069748113b9cf2ccbb8dcacbf08eb565df"} Apr 24 21:31:27.386822 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:27.386729 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-d7zsp" event={"ID":"e160702f-e9c0-4c05-96b6-712bc9933498","Type":"ContainerStarted","Data":"56cc3e53b806c744003e03cf237493cf9dd0d9c78d0f4d5edfb7250fc845a467"} Apr 24 21:31:27.387206 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:27.386843 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-d7zsp" Apr 24 21:31:27.433314 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:27.433268 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-d7zsp" podStartSLOduration=2.040976951 podStartE2EDuration="3.433253347s" podCreationTimestamp="2026-04-24 21:31:24 +0000 UTC" 
firstStartedPulling="2026-04-24 21:31:25.66076125 +0000 UTC m=+244.769854713" lastFinishedPulling="2026-04-24 21:31:27.053037632 +0000 UTC m=+246.162131109" observedRunningTime="2026-04-24 21:31:27.41699372 +0000 UTC m=+246.526087204" watchObservedRunningTime="2026-04-24 21:31:27.433253347 +0000 UTC m=+246.542346829" Apr 24 21:31:27.562341 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:27.562309 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1b6280f7-1c9b-460d-88e4-0de2f0be886d-certificates\") pod \"keda-operator-ffbb595cb-xv6mh\" (UID: \"1b6280f7-1c9b-460d-88e4-0de2f0be886d\") " pod="openshift-keda/keda-operator-ffbb595cb-xv6mh" Apr 24 21:31:27.562508 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:27.562421 2581 secret.go:281] references non-existent secret key: ca.crt Apr 24 21:31:27.562508 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:27.562433 2581 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 24 21:31:27.562508 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:27.562441 2581 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-xv6mh: references non-existent secret key: ca.crt Apr 24 21:31:27.562508 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:27.562484 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b6280f7-1c9b-460d-88e4-0de2f0be886d-certificates podName:1b6280f7-1c9b-460d-88e4-0de2f0be886d nodeName:}" failed. No retries permitted until 2026-04-24 21:31:31.562471534 +0000 UTC m=+250.671564998 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/1b6280f7-1c9b-460d-88e4-0de2f0be886d-certificates") pod "keda-operator-ffbb595cb-xv6mh" (UID: "1b6280f7-1c9b-460d-88e4-0de2f0be886d") : references non-existent secret key: ca.crt Apr 24 21:31:28.065366 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:28.065319 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e4e605c8-403a-467c-ab81-a53be22351c6-certificates\") pod \"keda-metrics-apiserver-7c9f485588-w67kn\" (UID: \"e4e605c8-403a-467c-ab81-a53be22351c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn" Apr 24 21:31:28.065558 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:28.065457 2581 secret.go:281] references non-existent secret key: tls.crt Apr 24 21:31:28.065558 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:28.065478 2581 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 24 21:31:28.065558 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:28.065496 2581 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn: references non-existent secret key: tls.crt Apr 24 21:31:28.065558 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:31:28.065551 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e4e605c8-403a-467c-ab81-a53be22351c6-certificates podName:e4e605c8-403a-467c-ab81-a53be22351c6 nodeName:}" failed. No retries permitted until 2026-04-24 21:31:32.065535232 +0000 UTC m=+251.174628699 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e4e605c8-403a-467c-ab81-a53be22351c6-certificates") pod "keda-metrics-apiserver-7c9f485588-w67kn" (UID: "e4e605c8-403a-467c-ab81-a53be22351c6") : references non-existent secret key: tls.crt Apr 24 21:31:31.591829 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:31.591798 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1b6280f7-1c9b-460d-88e4-0de2f0be886d-certificates\") pod \"keda-operator-ffbb595cb-xv6mh\" (UID: \"1b6280f7-1c9b-460d-88e4-0de2f0be886d\") " pod="openshift-keda/keda-operator-ffbb595cb-xv6mh" Apr 24 21:31:31.594313 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:31.594291 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/1b6280f7-1c9b-460d-88e4-0de2f0be886d-certificates\") pod \"keda-operator-ffbb595cb-xv6mh\" (UID: \"1b6280f7-1c9b-460d-88e4-0de2f0be886d\") " pod="openshift-keda/keda-operator-ffbb595cb-xv6mh" Apr 24 21:31:31.647322 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:31.647290 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-xv6mh" Apr 24 21:31:31.773667 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:31.773643 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-xv6mh"] Apr 24 21:31:31.776167 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:31:31.776138 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b6280f7_1c9b_460d_88e4_0de2f0be886d.slice/crio-ef5c5c884368dd968b4c2e4d3a342feffc4f3f6beaa0960b08eff7660e57ae61 WatchSource:0}: Error finding container ef5c5c884368dd968b4c2e4d3a342feffc4f3f6beaa0960b08eff7660e57ae61: Status 404 returned error can't find the container with id ef5c5c884368dd968b4c2e4d3a342feffc4f3f6beaa0960b08eff7660e57ae61 Apr 24 21:31:32.094588 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:32.094557 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e4e605c8-403a-467c-ab81-a53be22351c6-certificates\") pod \"keda-metrics-apiserver-7c9f485588-w67kn\" (UID: \"e4e605c8-403a-467c-ab81-a53be22351c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn" Apr 24 21:31:32.097095 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:32.097076 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e4e605c8-403a-467c-ab81-a53be22351c6-certificates\") pod \"keda-metrics-apiserver-7c9f485588-w67kn\" (UID: \"e4e605c8-403a-467c-ab81-a53be22351c6\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn" Apr 24 21:31:32.099880 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:32.099853 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn" Apr 24 21:31:32.242128 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:32.242090 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn"] Apr 24 21:31:32.243999 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:31:32.243960 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4e605c8_403a_467c_ab81_a53be22351c6.slice/crio-90f3cfc5eca36d1639cacea9527175d5b5a13a73ba5eedf285e52b81b1df9898 WatchSource:0}: Error finding container 90f3cfc5eca36d1639cacea9527175d5b5a13a73ba5eedf285e52b81b1df9898: Status 404 returned error can't find the container with id 90f3cfc5eca36d1639cacea9527175d5b5a13a73ba5eedf285e52b81b1df9898 Apr 24 21:31:32.401260 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:32.401177 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-xv6mh" event={"ID":"1b6280f7-1c9b-460d-88e4-0de2f0be886d","Type":"ContainerStarted","Data":"ef5c5c884368dd968b4c2e4d3a342feffc4f3f6beaa0960b08eff7660e57ae61"} Apr 24 21:31:32.402636 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:32.402607 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn" event={"ID":"e4e605c8-403a-467c-ab81-a53be22351c6","Type":"ContainerStarted","Data":"90f3cfc5eca36d1639cacea9527175d5b5a13a73ba5eedf285e52b81b1df9898"} Apr 24 21:31:35.412384 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:35.412353 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-xv6mh" event={"ID":"1b6280f7-1c9b-460d-88e4-0de2f0be886d","Type":"ContainerStarted","Data":"4e67e6a40bb9c6976e5bf55f0b4cd03d267dfe74aca9420d1b5f969315c66752"} Apr 24 21:31:35.413713 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:35.413690 2581 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn" event={"ID":"e4e605c8-403a-467c-ab81-a53be22351c6","Type":"ContainerStarted","Data":"509d5bbe0972ad04edb2be58ca04e2b1fe45ed13f3630c9a137c0bab899025b3"} Apr 24 21:31:36.417666 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:36.417630 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn" Apr 24 21:31:36.434866 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:36.434812 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn" podStartSLOduration=9.333105348 podStartE2EDuration="12.434794497s" podCreationTimestamp="2026-04-24 21:31:24 +0000 UTC" firstStartedPulling="2026-04-24 21:31:32.245671903 +0000 UTC m=+251.354765369" lastFinishedPulling="2026-04-24 21:31:35.347361055 +0000 UTC m=+254.456454518" observedRunningTime="2026-04-24 21:31:36.434455048 +0000 UTC m=+255.543548554" watchObservedRunningTime="2026-04-24 21:31:36.434794497 +0000 UTC m=+255.543887984" Apr 24 21:31:36.451523 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:36.451477 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-xv6mh" podStartSLOduration=9.88179246 podStartE2EDuration="13.451462945s" podCreationTimestamp="2026-04-24 21:31:23 +0000 UTC" firstStartedPulling="2026-04-24 21:31:31.77762379 +0000 UTC m=+250.886717257" lastFinishedPulling="2026-04-24 21:31:35.34729428 +0000 UTC m=+254.456387742" observedRunningTime="2026-04-24 21:31:36.450551006 +0000 UTC m=+255.559644513" watchObservedRunningTime="2026-04-24 21:31:36.451462945 +0000 UTC m=+255.560556430" Apr 24 21:31:44.376180 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:44.376150 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-zbl7d" Apr 24 21:31:46.418392 
ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:46.418355 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-xv6mh"
Apr 24 21:31:47.425402 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:47.425372 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-w67kn"
Apr 24 21:31:48.392205 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:48.392167 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-d7zsp"
Apr 24 21:31:56.421015 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:31:56.420978 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-xv6mh"
Apr 24 21:32:21.414985 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:21.414953 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 21:32:21.415800 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:21.415781 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 21:32:21.421641 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:21.421623 2581 kubelet.go:1628] "Image garbage collection succeeded"
Apr 24 21:32:37.092793 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.092758 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-n9gs7"]
Apr 24 21:32:37.096040 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.095998 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-bqnv4"]
Apr 24 21:32:37.098882 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.098865 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-bqnv4"
Apr 24 21:32:37.098959 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.098878 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-n9gs7"
Apr 24 21:32:37.101318 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.101288 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 24 21:32:37.101435 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.101336 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 24 21:32:37.101435 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.101417 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 24 21:32:37.102383 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.102312 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 24 21:32:37.102491 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.102385 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-jncz7\""
Apr 24 21:32:37.102491 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.102385 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-k587g\""
Apr 24 21:32:37.105131 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.105112 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-n9gs7"]
Apr 24 21:32:37.108873 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.108849 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-bqnv4"]
Apr 24 21:32:37.127345 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.127322 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-dwp6q"]
Apr 24 21:32:37.130974 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.130954 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-dwp6q"
Apr 24 21:32:37.133224 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.133205 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 24 21:32:37.133339 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.133231 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-dbdtf\""
Apr 24 21:32:37.143687 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.143660 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-dwp6q"]
Apr 24 21:32:37.280226 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.280192 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b58db69c-b1e7-4fca-95af-227c88662d75-cert\") pod \"kserve-controller-manager-67f77cd7d7-n9gs7\" (UID: \"b58db69c-b1e7-4fca-95af-227c88662d75\") " pod="kserve/kserve-controller-manager-67f77cd7d7-n9gs7"
Apr 24 21:32:37.280426 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.280240 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99bfde4d-b1e5-40ab-a9de-decfb64464b1-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-bqnv4\" (UID: \"99bfde4d-b1e5-40ab-a9de-decfb64464b1\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-bqnv4"
Apr 24 21:32:37.280426 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.280269 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klfnc\" (UniqueName: \"kubernetes.io/projected/99bfde4d-b1e5-40ab-a9de-decfb64464b1-kube-api-access-klfnc\") pod \"llmisvc-controller-manager-68cc5db7c4-bqnv4\" (UID: \"99bfde4d-b1e5-40ab-a9de-decfb64464b1\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-bqnv4"
Apr 24 21:32:37.280426 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.280310 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5bls\" (UniqueName: \"kubernetes.io/projected/b58db69c-b1e7-4fca-95af-227c88662d75-kube-api-access-s5bls\") pod \"kserve-controller-manager-67f77cd7d7-n9gs7\" (UID: \"b58db69c-b1e7-4fca-95af-227c88662d75\") " pod="kserve/kserve-controller-manager-67f77cd7d7-n9gs7"
Apr 24 21:32:37.280426 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.280345 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z88d6\" (UniqueName: \"kubernetes.io/projected/399106c1-f819-42fa-9829-852289131233-kube-api-access-z88d6\") pod \"seaweedfs-86cc847c5c-dwp6q\" (UID: \"399106c1-f819-42fa-9829-852289131233\") " pod="kserve/seaweedfs-86cc847c5c-dwp6q"
Apr 24 21:32:37.280426 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.280369 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/399106c1-f819-42fa-9829-852289131233-data\") pod \"seaweedfs-86cc847c5c-dwp6q\" (UID: \"399106c1-f819-42fa-9829-852289131233\") " pod="kserve/seaweedfs-86cc847c5c-dwp6q"
Apr 24 21:32:37.381656 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.381580 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b58db69c-b1e7-4fca-95af-227c88662d75-cert\") pod \"kserve-controller-manager-67f77cd7d7-n9gs7\" (UID: \"b58db69c-b1e7-4fca-95af-227c88662d75\") " pod="kserve/kserve-controller-manager-67f77cd7d7-n9gs7"
Apr 24 21:32:37.381656 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.381616 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99bfde4d-b1e5-40ab-a9de-decfb64464b1-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-bqnv4\" (UID: \"99bfde4d-b1e5-40ab-a9de-decfb64464b1\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-bqnv4"
Apr 24 21:32:37.381656 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.381635 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klfnc\" (UniqueName: \"kubernetes.io/projected/99bfde4d-b1e5-40ab-a9de-decfb64464b1-kube-api-access-klfnc\") pod \"llmisvc-controller-manager-68cc5db7c4-bqnv4\" (UID: \"99bfde4d-b1e5-40ab-a9de-decfb64464b1\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-bqnv4"
Apr 24 21:32:37.381656 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.381654 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5bls\" (UniqueName: \"kubernetes.io/projected/b58db69c-b1e7-4fca-95af-227c88662d75-kube-api-access-s5bls\") pod \"kserve-controller-manager-67f77cd7d7-n9gs7\" (UID: \"b58db69c-b1e7-4fca-95af-227c88662d75\") " pod="kserve/kserve-controller-manager-67f77cd7d7-n9gs7"
Apr 24 21:32:37.381953 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.381676 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z88d6\" (UniqueName: \"kubernetes.io/projected/399106c1-f819-42fa-9829-852289131233-kube-api-access-z88d6\") pod \"seaweedfs-86cc847c5c-dwp6q\" (UID: \"399106c1-f819-42fa-9829-852289131233\") " pod="kserve/seaweedfs-86cc847c5c-dwp6q"
Apr 24 21:32:37.381953 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.381702 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/399106c1-f819-42fa-9829-852289131233-data\") pod \"seaweedfs-86cc847c5c-dwp6q\" (UID: \"399106c1-f819-42fa-9829-852289131233\") " pod="kserve/seaweedfs-86cc847c5c-dwp6q"
Apr 24 21:32:37.381953 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:32:37.381768 2581 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found
Apr 24 21:32:37.381953 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:32:37.381851 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99bfde4d-b1e5-40ab-a9de-decfb64464b1-cert podName:99bfde4d-b1e5-40ab-a9de-decfb64464b1 nodeName:}" failed. No retries permitted until 2026-04-24 21:32:37.881828437 +0000 UTC m=+316.990921904 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/99bfde4d-b1e5-40ab-a9de-decfb64464b1-cert") pod "llmisvc-controller-manager-68cc5db7c4-bqnv4" (UID: "99bfde4d-b1e5-40ab-a9de-decfb64464b1") : secret "llmisvc-webhook-server-cert" not found
Apr 24 21:32:37.382170 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.382095 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/399106c1-f819-42fa-9829-852289131233-data\") pod \"seaweedfs-86cc847c5c-dwp6q\" (UID: \"399106c1-f819-42fa-9829-852289131233\") " pod="kserve/seaweedfs-86cc847c5c-dwp6q"
Apr 24 21:32:37.384302 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.384280 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b58db69c-b1e7-4fca-95af-227c88662d75-cert\") pod \"kserve-controller-manager-67f77cd7d7-n9gs7\" (UID: \"b58db69c-b1e7-4fca-95af-227c88662d75\") " pod="kserve/kserve-controller-manager-67f77cd7d7-n9gs7"
Apr 24 21:32:37.395472 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.395433 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klfnc\" (UniqueName: \"kubernetes.io/projected/99bfde4d-b1e5-40ab-a9de-decfb64464b1-kube-api-access-klfnc\") pod \"llmisvc-controller-manager-68cc5db7c4-bqnv4\" (UID: \"99bfde4d-b1e5-40ab-a9de-decfb64464b1\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-bqnv4"
Apr 24 21:32:37.395892 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.395875 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z88d6\" (UniqueName: \"kubernetes.io/projected/399106c1-f819-42fa-9829-852289131233-kube-api-access-z88d6\") pod \"seaweedfs-86cc847c5c-dwp6q\" (UID: \"399106c1-f819-42fa-9829-852289131233\") " pod="kserve/seaweedfs-86cc847c5c-dwp6q"
Apr 24 21:32:37.395979 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.395963 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5bls\" (UniqueName: \"kubernetes.io/projected/b58db69c-b1e7-4fca-95af-227c88662d75-kube-api-access-s5bls\") pod \"kserve-controller-manager-67f77cd7d7-n9gs7\" (UID: \"b58db69c-b1e7-4fca-95af-227c88662d75\") " pod="kserve/kserve-controller-manager-67f77cd7d7-n9gs7"
Apr 24 21:32:37.419942 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.419920 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-n9gs7"
Apr 24 21:32:37.440892 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.440867 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-dwp6q"
Apr 24 21:32:37.572366 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.572210 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-n9gs7"]
Apr 24 21:32:37.574467 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:32:37.574438 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb58db69c_b1e7_4fca_95af_227c88662d75.slice/crio-abd2d24d3cbdbb8eda24a0c3811160e9cb6cad76086a3b576cc1e5d1a338b12c WatchSource:0}: Error finding container abd2d24d3cbdbb8eda24a0c3811160e9cb6cad76086a3b576cc1e5d1a338b12c: Status 404 returned error can't find the container with id abd2d24d3cbdbb8eda24a0c3811160e9cb6cad76086a3b576cc1e5d1a338b12c
Apr 24 21:32:37.575715 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.575697 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:32:37.584670 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.584647 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-n9gs7" event={"ID":"b58db69c-b1e7-4fca-95af-227c88662d75","Type":"ContainerStarted","Data":"abd2d24d3cbdbb8eda24a0c3811160e9cb6cad76086a3b576cc1e5d1a338b12c"}
Apr 24 21:32:37.602966 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.602940 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-dwp6q"]
Apr 24 21:32:37.605697 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:32:37.605675 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod399106c1_f819_42fa_9829_852289131233.slice/crio-70550670aa1d2d177f902d0e8aec131645bf625868ca4b04863d499054b30154 WatchSource:0}: Error finding container 70550670aa1d2d177f902d0e8aec131645bf625868ca4b04863d499054b30154: Status 404 returned error can't find the container with id 70550670aa1d2d177f902d0e8aec131645bf625868ca4b04863d499054b30154
Apr 24 21:32:37.886486 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.886455 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99bfde4d-b1e5-40ab-a9de-decfb64464b1-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-bqnv4\" (UID: \"99bfde4d-b1e5-40ab-a9de-decfb64464b1\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-bqnv4"
Apr 24 21:32:37.889015 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:37.888979 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/99bfde4d-b1e5-40ab-a9de-decfb64464b1-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-bqnv4\" (UID: \"99bfde4d-b1e5-40ab-a9de-decfb64464b1\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-bqnv4"
Apr 24 21:32:38.014732 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:38.014641 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-bqnv4"
Apr 24 21:32:38.318506 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:38.318471 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-bqnv4"]
Apr 24 21:32:38.322120 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:32:38.322084 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod99bfde4d_b1e5_40ab_a9de_decfb64464b1.slice/crio-8af19a1e7fbe3ddee9416a201e9d709bf4b253baa32aaf8b0a5b4252d2fb7147 WatchSource:0}: Error finding container 8af19a1e7fbe3ddee9416a201e9d709bf4b253baa32aaf8b0a5b4252d2fb7147: Status 404 returned error can't find the container with id 8af19a1e7fbe3ddee9416a201e9d709bf4b253baa32aaf8b0a5b4252d2fb7147
Apr 24 21:32:38.600484 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:38.600448 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-bqnv4" event={"ID":"99bfde4d-b1e5-40ab-a9de-decfb64464b1","Type":"ContainerStarted","Data":"8af19a1e7fbe3ddee9416a201e9d709bf4b253baa32aaf8b0a5b4252d2fb7147"}
Apr 24 21:32:38.603809 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:38.603778 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-dwp6q" event={"ID":"399106c1-f819-42fa-9829-852289131233","Type":"ContainerStarted","Data":"70550670aa1d2d177f902d0e8aec131645bf625868ca4b04863d499054b30154"}
Apr 24 21:32:42.620147 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:42.620111 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-dwp6q" event={"ID":"399106c1-f819-42fa-9829-852289131233","Type":"ContainerStarted","Data":"b1b9ddeceaf48f06962e4d530967c0e4415a97cba4c371bcf5fdda403af1806d"}
Apr 24 21:32:42.620578 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:42.620157 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-dwp6q"
Apr 24 21:32:42.621903 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:42.621880 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-n9gs7" event={"ID":"b58db69c-b1e7-4fca-95af-227c88662d75","Type":"ContainerStarted","Data":"f64fa372c1835b1a25974fdbe8bcbf36a8e37394501d5d76c078890566b8dd50"}
Apr 24 21:32:42.622060 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:42.622048 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-67f77cd7d7-n9gs7"
Apr 24 21:32:42.637453 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:42.637387 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-dwp6q" podStartSLOduration=1.2233771820000001 podStartE2EDuration="5.637372621s" podCreationTimestamp="2026-04-24 21:32:37 +0000 UTC" firstStartedPulling="2026-04-24 21:32:37.606975023 +0000 UTC m=+316.716068500" lastFinishedPulling="2026-04-24 21:32:42.020970465 +0000 UTC m=+321.130063939" observedRunningTime="2026-04-24 21:32:42.635973593 +0000 UTC m=+321.745067078" watchObservedRunningTime="2026-04-24 21:32:42.637372621 +0000 UTC m=+321.746466106"
Apr 24 21:32:42.652146 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:42.652088 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-67f77cd7d7-n9gs7" podStartSLOduration=1.562279514 podStartE2EDuration="5.652067683s" podCreationTimestamp="2026-04-24 21:32:37 +0000 UTC" firstStartedPulling="2026-04-24 21:32:37.575825682 +0000 UTC m=+316.684919145" lastFinishedPulling="2026-04-24 21:32:41.66561385 +0000 UTC m=+320.774707314" observedRunningTime="2026-04-24 21:32:42.651416457 +0000 UTC m=+321.760509942" watchObservedRunningTime="2026-04-24 21:32:42.652067683 +0000 UTC m=+321.761161171"
Apr 24 21:32:43.626114 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:43.626067 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-bqnv4" event={"ID":"99bfde4d-b1e5-40ab-a9de-decfb64464b1","Type":"ContainerStarted","Data":"9d3c967f10359bb97fb347a673791ff94605e297af329e36d7b7f36de6c1ce07"}
Apr 24 21:32:43.626545 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:43.626380 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-bqnv4"
Apr 24 21:32:43.651368 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:43.651322 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-bqnv4" podStartSLOduration=2.383440561 podStartE2EDuration="6.651307401s" podCreationTimestamp="2026-04-24 21:32:37 +0000 UTC" firstStartedPulling="2026-04-24 21:32:38.323964064 +0000 UTC m=+317.433057539" lastFinishedPulling="2026-04-24 21:32:42.591830912 +0000 UTC m=+321.700924379" observedRunningTime="2026-04-24 21:32:43.649913531 +0000 UTC m=+322.759007039" watchObservedRunningTime="2026-04-24 21:32:43.651307401 +0000 UTC m=+322.760400886"
Apr 24 21:32:48.628678 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:32:48.628649 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-dwp6q"
Apr 24 21:33:13.631455 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:13.631382 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-67f77cd7d7-n9gs7"
Apr 24 21:33:14.631652 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:14.631614 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-bqnv4"
Apr 24 21:33:16.159706 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.159672 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-n9gs7"]
Apr 24 21:33:16.160104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.159914 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-67f77cd7d7-n9gs7" podUID="b58db69c-b1e7-4fca-95af-227c88662d75" containerName="manager" containerID="cri-o://f64fa372c1835b1a25974fdbe8bcbf36a8e37394501d5d76c078890566b8dd50" gracePeriod=10
Apr 24 21:33:16.188393 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.188362 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-2fxsv"]
Apr 24 21:33:16.255104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.255079 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-2fxsv"]
Apr 24 21:33:16.255235 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.255223 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-2fxsv"
Apr 24 21:33:16.383345 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.383309 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mtgv\" (UniqueName: \"kubernetes.io/projected/57df7126-9e58-4277-9d60-5589e17b0901-kube-api-access-4mtgv\") pod \"kserve-controller-manager-67f77cd7d7-2fxsv\" (UID: \"57df7126-9e58-4277-9d60-5589e17b0901\") " pod="kserve/kserve-controller-manager-67f77cd7d7-2fxsv"
Apr 24 21:33:16.383345 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.383360 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57df7126-9e58-4277-9d60-5589e17b0901-cert\") pod \"kserve-controller-manager-67f77cd7d7-2fxsv\" (UID: \"57df7126-9e58-4277-9d60-5589e17b0901\") " pod="kserve/kserve-controller-manager-67f77cd7d7-2fxsv"
Apr 24 21:33:16.417327 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.417262 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-n9gs7"
Apr 24 21:33:16.484664 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.484633 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b58db69c-b1e7-4fca-95af-227c88662d75-cert\") pod \"b58db69c-b1e7-4fca-95af-227c88662d75\" (UID: \"b58db69c-b1e7-4fca-95af-227c88662d75\") "
Apr 24 21:33:16.484838 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.484691 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5bls\" (UniqueName: \"kubernetes.io/projected/b58db69c-b1e7-4fca-95af-227c88662d75-kube-api-access-s5bls\") pod \"b58db69c-b1e7-4fca-95af-227c88662d75\" (UID: \"b58db69c-b1e7-4fca-95af-227c88662d75\") "
Apr 24 21:33:16.484890 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.484857 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mtgv\" (UniqueName: \"kubernetes.io/projected/57df7126-9e58-4277-9d60-5589e17b0901-kube-api-access-4mtgv\") pod \"kserve-controller-manager-67f77cd7d7-2fxsv\" (UID: \"57df7126-9e58-4277-9d60-5589e17b0901\") " pod="kserve/kserve-controller-manager-67f77cd7d7-2fxsv"
Apr 24 21:33:16.484946 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.484896 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57df7126-9e58-4277-9d60-5589e17b0901-cert\") pod \"kserve-controller-manager-67f77cd7d7-2fxsv\" (UID: \"57df7126-9e58-4277-9d60-5589e17b0901\") " pod="kserve/kserve-controller-manager-67f77cd7d7-2fxsv"
Apr 24 21:33:16.487089 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.487057 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b58db69c-b1e7-4fca-95af-227c88662d75-kube-api-access-s5bls" (OuterVolumeSpecName: "kube-api-access-s5bls") pod "b58db69c-b1e7-4fca-95af-227c88662d75" (UID: "b58db69c-b1e7-4fca-95af-227c88662d75"). InnerVolumeSpecName "kube-api-access-s5bls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:33:16.487205 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.487113 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b58db69c-b1e7-4fca-95af-227c88662d75-cert" (OuterVolumeSpecName: "cert") pod "b58db69c-b1e7-4fca-95af-227c88662d75" (UID: "b58db69c-b1e7-4fca-95af-227c88662d75"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:33:16.487434 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.487414 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57df7126-9e58-4277-9d60-5589e17b0901-cert\") pod \"kserve-controller-manager-67f77cd7d7-2fxsv\" (UID: \"57df7126-9e58-4277-9d60-5589e17b0901\") " pod="kserve/kserve-controller-manager-67f77cd7d7-2fxsv"
Apr 24 21:33:16.503999 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.503971 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mtgv\" (UniqueName: \"kubernetes.io/projected/57df7126-9e58-4277-9d60-5589e17b0901-kube-api-access-4mtgv\") pod \"kserve-controller-manager-67f77cd7d7-2fxsv\" (UID: \"57df7126-9e58-4277-9d60-5589e17b0901\") " pod="kserve/kserve-controller-manager-67f77cd7d7-2fxsv"
Apr 24 21:33:16.585410 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.585376 2581 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b58db69c-b1e7-4fca-95af-227c88662d75-cert\") on node \"ip-10-0-130-31.ec2.internal\" DevicePath \"\""
Apr 24 21:33:16.585410 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.585407 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s5bls\" (UniqueName: \"kubernetes.io/projected/b58db69c-b1e7-4fca-95af-227c88662d75-kube-api-access-s5bls\") on node \"ip-10-0-130-31.ec2.internal\" DevicePath \"\""
Apr 24 21:33:16.639539 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.639504 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-2fxsv"
Apr 24 21:33:16.726856 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.726824 2581 generic.go:358] "Generic (PLEG): container finished" podID="b58db69c-b1e7-4fca-95af-227c88662d75" containerID="f64fa372c1835b1a25974fdbe8bcbf36a8e37394501d5d76c078890566b8dd50" exitCode=0
Apr 24 21:33:16.727005 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.726880 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-n9gs7" event={"ID":"b58db69c-b1e7-4fca-95af-227c88662d75","Type":"ContainerDied","Data":"f64fa372c1835b1a25974fdbe8bcbf36a8e37394501d5d76c078890566b8dd50"}
Apr 24 21:33:16.727005 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.726885 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-67f77cd7d7-n9gs7"
Apr 24 21:33:16.727005 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.726905 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-n9gs7" event={"ID":"b58db69c-b1e7-4fca-95af-227c88662d75","Type":"ContainerDied","Data":"abd2d24d3cbdbb8eda24a0c3811160e9cb6cad76086a3b576cc1e5d1a338b12c"}
Apr 24 21:33:16.727005 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.726926 2581 scope.go:117] "RemoveContainer" containerID="f64fa372c1835b1a25974fdbe8bcbf36a8e37394501d5d76c078890566b8dd50"
Apr 24 21:33:16.737091 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.737072 2581 scope.go:117] "RemoveContainer" containerID="f64fa372c1835b1a25974fdbe8bcbf36a8e37394501d5d76c078890566b8dd50"
Apr 24 21:33:16.737404 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:33:16.737386 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f64fa372c1835b1a25974fdbe8bcbf36a8e37394501d5d76c078890566b8dd50\": container with ID starting with f64fa372c1835b1a25974fdbe8bcbf36a8e37394501d5d76c078890566b8dd50 not found: ID does not exist" containerID="f64fa372c1835b1a25974fdbe8bcbf36a8e37394501d5d76c078890566b8dd50"
Apr 24 21:33:16.737480 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.737413 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f64fa372c1835b1a25974fdbe8bcbf36a8e37394501d5d76c078890566b8dd50"} err="failed to get container status \"f64fa372c1835b1a25974fdbe8bcbf36a8e37394501d5d76c078890566b8dd50\": rpc error: code = NotFound desc = could not find container \"f64fa372c1835b1a25974fdbe8bcbf36a8e37394501d5d76c078890566b8dd50\": container with ID starting with f64fa372c1835b1a25974fdbe8bcbf36a8e37394501d5d76c078890566b8dd50 not found: ID does not exist"
Apr 24 21:33:16.760081 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.760051 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-n9gs7"]
Apr 24 21:33:16.774990 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.774958 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-n9gs7"]
Apr 24 21:33:16.787970 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:16.787947 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-67f77cd7d7-2fxsv"]
Apr 24 21:33:16.791438 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:33:16.791404 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57df7126_9e58_4277_9d60_5589e17b0901.slice/crio-b678bba56130d13045829f5fd5a87c0b01f95178b5b27414236a29d486dc96bd WatchSource:0}: Error finding container b678bba56130d13045829f5fd5a87c0b01f95178b5b27414236a29d486dc96bd: Status 404 returned error can't find the container with id b678bba56130d13045829f5fd5a87c0b01f95178b5b27414236a29d486dc96bd
Apr 24 21:33:17.523038 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:17.522941 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b58db69c-b1e7-4fca-95af-227c88662d75" path="/var/lib/kubelet/pods/b58db69c-b1e7-4fca-95af-227c88662d75/volumes"
Apr 24 21:33:17.731715 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:17.731679 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-2fxsv" event={"ID":"57df7126-9e58-4277-9d60-5589e17b0901","Type":"ContainerStarted","Data":"b3a91bda6c0ea67e1f666e5b8699336fa488bd8eb3129c70093a9257df3e8b04"}
Apr 24 21:33:17.731715 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:17.731720 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-67f77cd7d7-2fxsv" event={"ID":"57df7126-9e58-4277-9d60-5589e17b0901","Type":"ContainerStarted","Data":"b678bba56130d13045829f5fd5a87c0b01f95178b5b27414236a29d486dc96bd"}
Apr 24 21:33:17.732172 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:17.731766 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-67f77cd7d7-2fxsv"
Apr 24 21:33:17.754673 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:17.754626 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-67f77cd7d7-2fxsv" podStartSLOduration=1.397746289 podStartE2EDuration="1.754612343s" podCreationTimestamp="2026-04-24 21:33:16 +0000 UTC" firstStartedPulling="2026-04-24 21:33:16.792613449 +0000 UTC m=+355.901706916" lastFinishedPulling="2026-04-24 21:33:17.149479503 +0000 UTC m=+356.258572970" observedRunningTime="2026-04-24 21:33:17.753532757 +0000 UTC m=+356.862626243" watchObservedRunningTime="2026-04-24 21:33:17.754612343 +0000 UTC m=+356.863705827"
Apr 24 21:33:48.740955 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:48.740920 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-67f77cd7d7-2fxsv"
Apr 24 21:33:49.594175 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.594145 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-cv8gm"]
Apr 24 21:33:49.594479 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.594466 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b58db69c-b1e7-4fca-95af-227c88662d75" containerName="manager"
Apr 24 21:33:49.594544 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.594480 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="b58db69c-b1e7-4fca-95af-227c88662d75" containerName="manager"
Apr 24 21:33:49.594597 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.594546 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="b58db69c-b1e7-4fca-95af-227c88662d75" containerName="manager"
Apr 24 21:33:49.597532 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.597514 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-cv8gm"
Apr 24 21:33:49.600000 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.599969 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 24 21:33:49.600000 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.599975 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-2dz5p\""
Apr 24 21:33:49.607622 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.607597 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-7twt6"]
Apr 24 21:33:49.611205 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.611183 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-cv8gm"]
Apr 24 21:33:49.611331 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.611291 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-7twt6"
Apr 24 21:33:49.613661 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.613637 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 24 21:33:49.613799 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.613781 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-zlbg8\""
Apr 24 21:33:49.621245 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.621215 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-7twt6"]
Apr 24 21:33:49.746615 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.746576 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9c29421-d9fc-45e4-8f9c-91ff424e0229-cert\") pod \"odh-model-controller-696fc77849-7twt6\" (UID: \"f9c29421-d9fc-45e4-8f9c-91ff424e0229\") " pod="kserve/odh-model-controller-696fc77849-7twt6"
Apr 24 21:33:49.746985 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.746625 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr56h\" (UniqueName: \"kubernetes.io/projected/f9c29421-d9fc-45e4-8f9c-91ff424e0229-kube-api-access-qr56h\") pod \"odh-model-controller-696fc77849-7twt6\" (UID: \"f9c29421-d9fc-45e4-8f9c-91ff424e0229\") " pod="kserve/odh-model-controller-696fc77849-7twt6"
Apr 24 21:33:49.746985 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.746651 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2ed14a9c-5fc2-4bff-9f41-8ac3ab230973-tls-certs\") pod \"model-serving-api-86f7b4b499-cv8gm\" (UID: \"2ed14a9c-5fc2-4bff-9f41-8ac3ab230973\") " pod="kserve/model-serving-api-86f7b4b499-cv8gm"
Apr 24 21:33:49.746985 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.746735 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rlfv\" (UniqueName: \"kubernetes.io/projected/2ed14a9c-5fc2-4bff-9f41-8ac3ab230973-kube-api-access-8rlfv\") pod \"model-serving-api-86f7b4b499-cv8gm\" (UID: \"2ed14a9c-5fc2-4bff-9f41-8ac3ab230973\") " pod="kserve/model-serving-api-86f7b4b499-cv8gm"
Apr 24 21:33:49.848122 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.848034 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rlfv\" (UniqueName: \"kubernetes.io/projected/2ed14a9c-5fc2-4bff-9f41-8ac3ab230973-kube-api-access-8rlfv\") pod \"model-serving-api-86f7b4b499-cv8gm\" (UID: \"2ed14a9c-5fc2-4bff-9f41-8ac3ab230973\") " pod="kserve/model-serving-api-86f7b4b499-cv8gm"
Apr 24 21:33:49.848122 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.848122 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9c29421-d9fc-45e4-8f9c-91ff424e0229-cert\") pod \"odh-model-controller-696fc77849-7twt6\" (UID: \"f9c29421-d9fc-45e4-8f9c-91ff424e0229\") " pod="kserve/odh-model-controller-696fc77849-7twt6"
Apr 24 21:33:49.848352 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.848152 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qr56h\" (UniqueName: \"kubernetes.io/projected/f9c29421-d9fc-45e4-8f9c-91ff424e0229-kube-api-access-qr56h\") pod \"odh-model-controller-696fc77849-7twt6\" (UID: \"f9c29421-d9fc-45e4-8f9c-91ff424e0229\") " pod="kserve/odh-model-controller-696fc77849-7twt6"
Apr 24 21:33:49.848352 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.848185 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2ed14a9c-5fc2-4bff-9f41-8ac3ab230973-tls-certs\") pod 
\"model-serving-api-86f7b4b499-cv8gm\" (UID: \"2ed14a9c-5fc2-4bff-9f41-8ac3ab230973\") " pod="kserve/model-serving-api-86f7b4b499-cv8gm" Apr 24 21:33:49.848352 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:33:49.848284 2581 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 24 21:33:49.848506 ip-10-0-130-31 kubenswrapper[2581]: E0424 21:33:49.848363 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f9c29421-d9fc-45e4-8f9c-91ff424e0229-cert podName:f9c29421-d9fc-45e4-8f9c-91ff424e0229 nodeName:}" failed. No retries permitted until 2026-04-24 21:33:50.34833684 +0000 UTC m=+389.457430306 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f9c29421-d9fc-45e4-8f9c-91ff424e0229-cert") pod "odh-model-controller-696fc77849-7twt6" (UID: "f9c29421-d9fc-45e4-8f9c-91ff424e0229") : secret "odh-model-controller-webhook-cert" not found Apr 24 21:33:49.850929 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.850892 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2ed14a9c-5fc2-4bff-9f41-8ac3ab230973-tls-certs\") pod \"model-serving-api-86f7b4b499-cv8gm\" (UID: \"2ed14a9c-5fc2-4bff-9f41-8ac3ab230973\") " pod="kserve/model-serving-api-86f7b4b499-cv8gm" Apr 24 21:33:49.858792 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.858765 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr56h\" (UniqueName: \"kubernetes.io/projected/f9c29421-d9fc-45e4-8f9c-91ff424e0229-kube-api-access-qr56h\") pod \"odh-model-controller-696fc77849-7twt6\" (UID: \"f9c29421-d9fc-45e4-8f9c-91ff424e0229\") " pod="kserve/odh-model-controller-696fc77849-7twt6" Apr 24 21:33:49.861406 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.861380 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8rlfv\" (UniqueName: \"kubernetes.io/projected/2ed14a9c-5fc2-4bff-9f41-8ac3ab230973-kube-api-access-8rlfv\") pod \"model-serving-api-86f7b4b499-cv8gm\" (UID: \"2ed14a9c-5fc2-4bff-9f41-8ac3ab230973\") " pod="kserve/model-serving-api-86f7b4b499-cv8gm" Apr 24 21:33:49.909864 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:49.909829 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-cv8gm" Apr 24 21:33:50.041797 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:50.041764 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-cv8gm"] Apr 24 21:33:50.045627 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:33:50.045597 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ed14a9c_5fc2_4bff_9f41_8ac3ab230973.slice/crio-384f9ce5c937e94d7e01d236ab92911dd5b28e46843853d4502856e8f94d25b9 WatchSource:0}: Error finding container 384f9ce5c937e94d7e01d236ab92911dd5b28e46843853d4502856e8f94d25b9: Status 404 returned error can't find the container with id 384f9ce5c937e94d7e01d236ab92911dd5b28e46843853d4502856e8f94d25b9 Apr 24 21:33:50.352969 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:50.352934 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9c29421-d9fc-45e4-8f9c-91ff424e0229-cert\") pod \"odh-model-controller-696fc77849-7twt6\" (UID: \"f9c29421-d9fc-45e4-8f9c-91ff424e0229\") " pod="kserve/odh-model-controller-696fc77849-7twt6" Apr 24 21:33:50.355486 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:50.355467 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f9c29421-d9fc-45e4-8f9c-91ff424e0229-cert\") pod \"odh-model-controller-696fc77849-7twt6\" (UID: \"f9c29421-d9fc-45e4-8f9c-91ff424e0229\") " 
pod="kserve/odh-model-controller-696fc77849-7twt6" Apr 24 21:33:50.524751 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:50.524715 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-7twt6" Apr 24 21:33:50.701089 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:50.700906 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-7twt6"] Apr 24 21:33:50.746033 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:33:50.745991 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9c29421_d9fc_45e4_8f9c_91ff424e0229.slice/crio-5b21dbf1e29412e3193dced62a5bee37bc582b080bf0c450673f4c316f034952 WatchSource:0}: Error finding container 5b21dbf1e29412e3193dced62a5bee37bc582b080bf0c450673f4c316f034952: Status 404 returned error can't find the container with id 5b21dbf1e29412e3193dced62a5bee37bc582b080bf0c450673f4c316f034952 Apr 24 21:33:50.844575 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:50.844536 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-7twt6" event={"ID":"f9c29421-d9fc-45e4-8f9c-91ff424e0229","Type":"ContainerStarted","Data":"5b21dbf1e29412e3193dced62a5bee37bc582b080bf0c450673f4c316f034952"} Apr 24 21:33:50.845695 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:50.845669 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-cv8gm" event={"ID":"2ed14a9c-5fc2-4bff-9f41-8ac3ab230973","Type":"ContainerStarted","Data":"384f9ce5c937e94d7e01d236ab92911dd5b28e46843853d4502856e8f94d25b9"} Apr 24 21:33:51.852264 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:51.851988 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-cv8gm" 
event={"ID":"2ed14a9c-5fc2-4bff-9f41-8ac3ab230973","Type":"ContainerStarted","Data":"4c762b008f004c6c9c46806496f04821e913daf698796ed590aa6961b6e1eb1d"} Apr 24 21:33:51.852264 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:51.852160 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-cv8gm" Apr 24 21:33:51.871650 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:51.871582 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-cv8gm" podStartSLOduration=1.678207727 podStartE2EDuration="2.871560745s" podCreationTimestamp="2026-04-24 21:33:49 +0000 UTC" firstStartedPulling="2026-04-24 21:33:50.047395075 +0000 UTC m=+389.156488538" lastFinishedPulling="2026-04-24 21:33:51.240748075 +0000 UTC m=+390.349841556" observedRunningTime="2026-04-24 21:33:51.86945853 +0000 UTC m=+390.978552016" watchObservedRunningTime="2026-04-24 21:33:51.871560745 +0000 UTC m=+390.980654231" Apr 24 21:33:53.860866 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:53.860830 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-7twt6" event={"ID":"f9c29421-d9fc-45e4-8f9c-91ff424e0229","Type":"ContainerStarted","Data":"81e1a373b8030b7228269b3ddce9c5849e3f1d4e8ec7cec2b7e54a08335e3156"} Apr 24 21:33:53.861329 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:53.860965 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-7twt6" Apr 24 21:33:53.879665 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:33:53.879608 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-7twt6" podStartSLOduration=2.309645085 podStartE2EDuration="4.879587517s" podCreationTimestamp="2026-04-24 21:33:49 +0000 UTC" firstStartedPulling="2026-04-24 21:33:50.747742524 +0000 UTC m=+389.856836000" lastFinishedPulling="2026-04-24 
21:33:53.317684965 +0000 UTC m=+392.426778432" observedRunningTime="2026-04-24 21:33:53.87921447 +0000 UTC m=+392.988307957" watchObservedRunningTime="2026-04-24 21:33:53.879587517 +0000 UTC m=+392.988681002" Apr 24 21:34:02.861936 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:02.861904 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-cv8gm" Apr 24 21:34:04.866378 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:04.866352 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-7twt6" Apr 24 21:34:26.198746 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.198709 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw"] Apr 24 21:34:26.206265 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.206239 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" Apr 24 21:34:26.209818 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.209790 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-predictor-serving-cert\"" Apr 24 21:34:26.209818 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.209800 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:34:26.209999 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.209791 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:34:26.209999 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.209791 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\"" Apr 24 21:34:26.209999 ip-10-0-130-31 
kubenswrapper[2581]: I0424 21:34:26.209793 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-f87qg\"" Apr 24 21:34:26.215226 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.215204 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw"] Apr 24 21:34:26.337346 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.337292 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/721f5134-e6d0-43ca-865b-ae193260f5a3-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw\" (UID: \"721f5134-e6d0-43ca-865b-ae193260f5a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" Apr 24 21:34:26.337532 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.337383 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/721f5134-e6d0-43ca-865b-ae193260f5a3-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw\" (UID: \"721f5134-e6d0-43ca-865b-ae193260f5a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" Apr 24 21:34:26.337532 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.337459 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-579df\" (UniqueName: \"kubernetes.io/projected/721f5134-e6d0-43ca-865b-ae193260f5a3-kube-api-access-579df\") pod \"isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw\" (UID: \"721f5134-e6d0-43ca-865b-ae193260f5a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" Apr 24 21:34:26.337645 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.337541 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/721f5134-e6d0-43ca-865b-ae193260f5a3-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw\" (UID: \"721f5134-e6d0-43ca-865b-ae193260f5a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" Apr 24 21:34:26.438439 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.438404 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/721f5134-e6d0-43ca-865b-ae193260f5a3-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw\" (UID: \"721f5134-e6d0-43ca-865b-ae193260f5a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" Apr 24 21:34:26.438616 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.438474 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/721f5134-e6d0-43ca-865b-ae193260f5a3-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw\" (UID: \"721f5134-e6d0-43ca-865b-ae193260f5a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" Apr 24 21:34:26.438616 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.438503 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/721f5134-e6d0-43ca-865b-ae193260f5a3-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw\" (UID: \"721f5134-e6d0-43ca-865b-ae193260f5a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" Apr 24 21:34:26.438616 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.438521 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-579df\" 
(UniqueName: \"kubernetes.io/projected/721f5134-e6d0-43ca-865b-ae193260f5a3-kube-api-access-579df\") pod \"isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw\" (UID: \"721f5134-e6d0-43ca-865b-ae193260f5a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" Apr 24 21:34:26.438948 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.438925 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/721f5134-e6d0-43ca-865b-ae193260f5a3-kserve-provision-location\") pod \"isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw\" (UID: \"721f5134-e6d0-43ca-865b-ae193260f5a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" Apr 24 21:34:26.439280 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.439257 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/721f5134-e6d0-43ca-865b-ae193260f5a3-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw\" (UID: \"721f5134-e6d0-43ca-865b-ae193260f5a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" Apr 24 21:34:26.441117 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.441099 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/721f5134-e6d0-43ca-865b-ae193260f5a3-proxy-tls\") pod \"isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw\" (UID: \"721f5134-e6d0-43ca-865b-ae193260f5a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" Apr 24 21:34:26.447918 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.447892 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-579df\" (UniqueName: \"kubernetes.io/projected/721f5134-e6d0-43ca-865b-ae193260f5a3-kube-api-access-579df\") pod 
\"isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw\" (UID: \"721f5134-e6d0-43ca-865b-ae193260f5a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" Apr 24 21:34:26.517660 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.517569 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" Apr 24 21:34:26.666477 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.666399 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw"] Apr 24 21:34:26.669132 ip-10-0-130-31 kubenswrapper[2581]: W0424 21:34:26.669097 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod721f5134_e6d0_43ca_865b_ae193260f5a3.slice/crio-46531cb08a78b7f8e801ff70f9974e74c1dfa434f4aac43ea170fa69169fa6f7 WatchSource:0}: Error finding container 46531cb08a78b7f8e801ff70f9974e74c1dfa434f4aac43ea170fa69169fa6f7: Status 404 returned error can't find the container with id 46531cb08a78b7f8e801ff70f9974e74c1dfa434f4aac43ea170fa69169fa6f7 Apr 24 21:34:26.966421 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:26.966385 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" event={"ID":"721f5134-e6d0-43ca-865b-ae193260f5a3","Type":"ContainerStarted","Data":"46531cb08a78b7f8e801ff70f9974e74c1dfa434f4aac43ea170fa69169fa6f7"} Apr 24 21:34:30.982577 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:30.982536 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" event={"ID":"721f5134-e6d0-43ca-865b-ae193260f5a3","Type":"ContainerStarted","Data":"1fd6dbc0299a836fb0278b317f6bedb908357a31365d7d9765daa9b78cdd686f"} Apr 24 21:34:33.994041 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:33.993925 2581 generic.go:358] 
"Generic (PLEG): container finished" podID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerID="1fd6dbc0299a836fb0278b317f6bedb908357a31365d7d9765daa9b78cdd686f" exitCode=0 Apr 24 21:34:33.994041 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:33.994003 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" event={"ID":"721f5134-e6d0-43ca-865b-ae193260f5a3","Type":"ContainerDied","Data":"1fd6dbc0299a836fb0278b317f6bedb908357a31365d7d9765daa9b78cdd686f"} Apr 24 21:34:48.050702 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:48.050650 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" event={"ID":"721f5134-e6d0-43ca-865b-ae193260f5a3","Type":"ContainerStarted","Data":"78a1293d71b0f6362161dfe6ebde1806f7fddcddade40fac94423156cc645675"} Apr 24 21:34:50.059726 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:50.059683 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" event={"ID":"721f5134-e6d0-43ca-865b-ae193260f5a3","Type":"ContainerStarted","Data":"7f3a59cbf8310f90c7a5085e758fb7d0e58deb0ecdd48df478ae46f90bd3fec6"} Apr 24 21:34:50.060222 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:50.059853 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" Apr 24 21:34:50.081233 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:50.081181 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" podStartSLOduration=1.095780353 podStartE2EDuration="24.081165275s" podCreationTimestamp="2026-04-24 21:34:26 +0000 UTC" firstStartedPulling="2026-04-24 21:34:26.670955618 +0000 UTC m=+425.780049094" lastFinishedPulling="2026-04-24 21:34:49.656340538 +0000 UTC m=+448.765434016" 
observedRunningTime="2026-04-24 21:34:50.079512777 +0000 UTC m=+449.188606274" watchObservedRunningTime="2026-04-24 21:34:50.081165275 +0000 UTC m=+449.190258760" Apr 24 21:34:51.062861 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:51.062833 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" Apr 24 21:34:51.064198 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:51.064149 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" podUID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:34:52.065387 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:52.065332 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" podUID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:34:57.069625 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:57.069594 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" Apr 24 21:34:57.070215 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:34:57.070188 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" podUID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:35:07.070864 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:35:07.070823 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" 
podUID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:35:17.071166 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:35:17.071115 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" podUID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:35:27.070664 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:35:27.070623 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" podUID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:35:37.070597 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:35:37.070558 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" podUID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:35:47.070580 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:35:47.070545 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" podUID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 24 21:35:57.071520 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:35:57.071489 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" Apr 24 21:36:35.292585 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:35.292552 2581 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw"] Apr 24 21:36:35.293128 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:35.292867 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" podUID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerName="kserve-container" containerID="cri-o://78a1293d71b0f6362161dfe6ebde1806f7fddcddade40fac94423156cc645675" gracePeriod=30 Apr 24 21:36:35.293128 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:35.292940 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" podUID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerName="kube-rbac-proxy" containerID="cri-o://7f3a59cbf8310f90c7a5085e758fb7d0e58deb0ecdd48df478ae46f90bd3fec6" gracePeriod=30 Apr 24 21:36:36.397635 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:36.397598 2581 generic.go:358] "Generic (PLEG): container finished" podID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerID="7f3a59cbf8310f90c7a5085e758fb7d0e58deb0ecdd48df478ae46f90bd3fec6" exitCode=2 Apr 24 21:36:36.398009 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:36.397672 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" event={"ID":"721f5134-e6d0-43ca-865b-ae193260f5a3","Type":"ContainerDied","Data":"7f3a59cbf8310f90c7a5085e758fb7d0e58deb0ecdd48df478ae46f90bd3fec6"} Apr 24 21:36:37.066639 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:37.066590 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" podUID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.27:8643/healthz\": dial tcp 10.133.0.27:8643: connect: connection refused" Apr 24 21:36:37.070954 
ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:37.070923 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" podUID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused"
Apr 24 21:36:39.409742 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:39.409706 2581 generic.go:358] "Generic (PLEG): container finished" podID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerID="78a1293d71b0f6362161dfe6ebde1806f7fddcddade40fac94423156cc645675" exitCode=0
Apr 24 21:36:39.410104 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:39.409776 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" event={"ID":"721f5134-e6d0-43ca-865b-ae193260f5a3","Type":"ContainerDied","Data":"78a1293d71b0f6362161dfe6ebde1806f7fddcddade40fac94423156cc645675"}
Apr 24 21:36:39.435920 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:39.435902 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw"
Apr 24 21:36:39.508162 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:39.508132 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-579df\" (UniqueName: \"kubernetes.io/projected/721f5134-e6d0-43ca-865b-ae193260f5a3-kube-api-access-579df\") pod \"721f5134-e6d0-43ca-865b-ae193260f5a3\" (UID: \"721f5134-e6d0-43ca-865b-ae193260f5a3\") "
Apr 24 21:36:39.508298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:39.508179 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/721f5134-e6d0-43ca-865b-ae193260f5a3-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") pod \"721f5134-e6d0-43ca-865b-ae193260f5a3\" (UID: \"721f5134-e6d0-43ca-865b-ae193260f5a3\") "
Apr 24 21:36:39.508298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:39.508201 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/721f5134-e6d0-43ca-865b-ae193260f5a3-proxy-tls\") pod \"721f5134-e6d0-43ca-865b-ae193260f5a3\" (UID: \"721f5134-e6d0-43ca-865b-ae193260f5a3\") "
Apr 24 21:36:39.508298 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:39.508243 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/721f5134-e6d0-43ca-865b-ae193260f5a3-kserve-provision-location\") pod \"721f5134-e6d0-43ca-865b-ae193260f5a3\" (UID: \"721f5134-e6d0-43ca-865b-ae193260f5a3\") "
Apr 24 21:36:39.508615 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:39.508588 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/721f5134-e6d0-43ca-865b-ae193260f5a3-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config") pod "721f5134-e6d0-43ca-865b-ae193260f5a3" (UID: "721f5134-e6d0-43ca-865b-ae193260f5a3"). InnerVolumeSpecName "isvc-sklearn-graph-2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 21:36:39.508683 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:39.508618 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/721f5134-e6d0-43ca-865b-ae193260f5a3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "721f5134-e6d0-43ca-865b-ae193260f5a3" (UID: "721f5134-e6d0-43ca-865b-ae193260f5a3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:36:39.510535 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:39.510510 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/721f5134-e6d0-43ca-865b-ae193260f5a3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "721f5134-e6d0-43ca-865b-ae193260f5a3" (UID: "721f5134-e6d0-43ca-865b-ae193260f5a3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:36:39.510630 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:39.510510 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/721f5134-e6d0-43ca-865b-ae193260f5a3-kube-api-access-579df" (OuterVolumeSpecName: "kube-api-access-579df") pod "721f5134-e6d0-43ca-865b-ae193260f5a3" (UID: "721f5134-e6d0-43ca-865b-ae193260f5a3"). InnerVolumeSpecName "kube-api-access-579df". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:36:39.609568 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:39.609496 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-579df\" (UniqueName: \"kubernetes.io/projected/721f5134-e6d0-43ca-865b-ae193260f5a3-kube-api-access-579df\") on node \"ip-10-0-130-31.ec2.internal\" DevicePath \"\""
Apr 24 21:36:39.609568 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:39.609525 2581 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/721f5134-e6d0-43ca-865b-ae193260f5a3-isvc-sklearn-graph-2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-130-31.ec2.internal\" DevicePath \"\""
Apr 24 21:36:39.609568 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:39.609536 2581 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/721f5134-e6d0-43ca-865b-ae193260f5a3-proxy-tls\") on node \"ip-10-0-130-31.ec2.internal\" DevicePath \"\""
Apr 24 21:36:39.609568 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:39.609547 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/721f5134-e6d0-43ca-865b-ae193260f5a3-kserve-provision-location\") on node \"ip-10-0-130-31.ec2.internal\" DevicePath \"\""
Apr 24 21:36:40.414943 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:40.414917 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw"
Apr 24 21:36:40.415367 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:40.414912 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw" event={"ID":"721f5134-e6d0-43ca-865b-ae193260f5a3","Type":"ContainerDied","Data":"46531cb08a78b7f8e801ff70f9974e74c1dfa434f4aac43ea170fa69169fa6f7"}
Apr 24 21:36:40.415367 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:40.415063 2581 scope.go:117] "RemoveContainer" containerID="7f3a59cbf8310f90c7a5085e758fb7d0e58deb0ecdd48df478ae46f90bd3fec6"
Apr 24 21:36:40.423178 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:40.423162 2581 scope.go:117] "RemoveContainer" containerID="78a1293d71b0f6362161dfe6ebde1806f7fddcddade40fac94423156cc645675"
Apr 24 21:36:40.430243 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:40.430225 2581 scope.go:117] "RemoveContainer" containerID="1fd6dbc0299a836fb0278b317f6bedb908357a31365d7d9765daa9b78cdd686f"
Apr 24 21:36:40.430993 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:40.430973 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw"]
Apr 24 21:36:40.435120 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:40.435096 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-2-predictor-58cc99f6ff-jrwtw"]
Apr 24 21:36:41.522932 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:36:41.522897 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="721f5134-e6d0-43ca-865b-ae193260f5a3" path="/var/lib/kubelet/pods/721f5134-e6d0-43ca-865b-ae193260f5a3/volumes"
Apr 24 21:37:21.435881 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:37:21.435855 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 21:37:21.436804 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:37:21.436786 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 21:42:21.456090 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:42:21.455995 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 21:42:21.461450 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:42:21.461430 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 21:47:21.480775 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:47:21.480748 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 21:47:21.485773 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:47:21.485753 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 21:52:21.502561 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:52:21.502526 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 21:52:21.507840 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:52:21.507818 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 21:57:21.522476 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:57:21.522368 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 21:57:21.528844 ip-10-0-130-31 kubenswrapper[2581]: I0424 21:57:21.528826 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 22:02:21.542453 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:02:21.542423 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 22:02:21.550251 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:02:21.550229 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 22:07:21.563087 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:07:21.562948 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 22:07:21.573770 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:07:21.573749 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 22:12:21.586240 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:12:21.586133 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 22:12:21.594358 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:12:21.594339 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 22:14:05.354123 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:05.354096 2581 ???:1] "http: TLS handshake error from 10.0.132.124:49264: EOF"
Apr 24 22:14:05.362546 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:05.362524 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-rklj9_151371d6-8756-495b-8181-7fdcb156d1f4/global-pull-secret-syncer/0.log"
Apr 24 22:14:05.517421 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:05.517399 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-h4z9c_61331cbf-bfdf-44cd-895b-21d09c03e3a3/konnectivity-agent/0.log"
Apr 24 22:14:05.554135 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:05.554095 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-31.ec2.internal_245bce5339b00b6e9cfc0086658d8fb7/haproxy/0.log"
Apr 24 22:14:09.444990 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:09.444959 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pj7mx_9159faba-fd40-4471-843c-488aef676c4e/node-exporter/0.log"
Apr 24 22:14:09.474385 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:09.474361 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pj7mx_9159faba-fd40-4471-843c-488aef676c4e/kube-rbac-proxy/0.log"
Apr 24 22:14:09.503437 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:09.503416 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pj7mx_9159faba-fd40-4471-843c-488aef676c4e/init-textfile/0.log"
Apr 24 22:14:09.882396 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:09.882361 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-bcppr_fd73b28b-f395-46a8-b132-56d62ce821db/prometheus-operator/0.log"
Apr 24 22:14:09.899298 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:09.899275 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-bcppr_fd73b28b-f395-46a8-b132-56d62ce821db/kube-rbac-proxy/0.log"
Apr 24 22:14:09.925932 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:09.925898 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-qxfjs_a4a2f809-339a-4381-826d-07e74cd2ec89/prometheus-operator-admission-webhook/0.log"
Apr 24 22:14:11.993557 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:11.993528 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-tvnrx_6a569780-c1f3-462e-87dd-d4b03fe11d70/download-server/0.log"
Apr 24 22:14:12.389854 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.389821 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"]
Apr 24 22:14:12.390141 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.390129 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerName="storage-initializer"
Apr 24 22:14:12.390188 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.390144 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerName="storage-initializer"
Apr 24 22:14:12.390188 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.390155 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerName="kserve-container"
Apr 24 22:14:12.390188 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.390160 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerName="kserve-container"
Apr 24 22:14:12.390188 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.390168 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerName="kube-rbac-proxy"
Apr 24 22:14:12.390188 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.390173 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerName="kube-rbac-proxy"
Apr 24 22:14:12.390329 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.390231 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerName="kube-rbac-proxy"
Apr 24 22:14:12.390329 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.390237 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="721f5134-e6d0-43ca-865b-ae193260f5a3" containerName="kserve-container"
Apr 24 22:14:12.392830 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.392811 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"
Apr 24 22:14:12.394972 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.394953 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-ltmfk\"/\"openshift-service-ca.crt\""
Apr 24 22:14:12.395103 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.394980 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-ltmfk\"/\"default-dockercfg-9d9fp\""
Apr 24 22:14:12.395828 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.395811 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-ltmfk\"/\"kube-root-ca.crt\""
Apr 24 22:14:12.402602 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.402581 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"]
Apr 24 22:14:12.566164 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.566133 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c-lib-modules\") pod \"perf-node-gather-daemonset-tbtnc\" (UID: \"93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"
Apr 24 22:14:12.566307 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.566175 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c-podres\") pod \"perf-node-gather-daemonset-tbtnc\" (UID: \"93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"
Apr 24 22:14:12.566307 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.566199 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxg6s\" (UniqueName: \"kubernetes.io/projected/93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c-kube-api-access-qxg6s\") pod \"perf-node-gather-daemonset-tbtnc\" (UID: \"93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"
Apr 24 22:14:12.566307 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.566249 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c-sys\") pod \"perf-node-gather-daemonset-tbtnc\" (UID: \"93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"
Apr 24 22:14:12.566307 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.566305 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c-proc\") pod \"perf-node-gather-daemonset-tbtnc\" (UID: \"93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"
Apr 24 22:14:12.667069 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.666983 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c-sys\") pod \"perf-node-gather-daemonset-tbtnc\" (UID: \"93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"
Apr 24 22:14:12.667069 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.667048 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c-proc\") pod \"perf-node-gather-daemonset-tbtnc\" (UID: \"93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"
Apr 24 22:14:12.667240 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.667097 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c-sys\") pod \"perf-node-gather-daemonset-tbtnc\" (UID: \"93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"
Apr 24 22:14:12.667240 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.667109 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c-lib-modules\") pod \"perf-node-gather-daemonset-tbtnc\" (UID: \"93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"
Apr 24 22:14:12.667240 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.667160 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c-proc\") pod \"perf-node-gather-daemonset-tbtnc\" (UID: \"93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"
Apr 24 22:14:12.667240 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.667169 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c-podres\") pod \"perf-node-gather-daemonset-tbtnc\" (UID: \"93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"
Apr 24 22:14:12.667240 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.667201 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxg6s\" (UniqueName: \"kubernetes.io/projected/93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c-kube-api-access-qxg6s\") pod \"perf-node-gather-daemonset-tbtnc\" (UID: \"93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"
Apr 24 22:14:12.667496 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.667262 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c-podres\") pod \"perf-node-gather-daemonset-tbtnc\" (UID: \"93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"
Apr 24 22:14:12.667496 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.667309 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c-lib-modules\") pod \"perf-node-gather-daemonset-tbtnc\" (UID: \"93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"
Apr 24 22:14:12.675010 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.674988 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxg6s\" (UniqueName: \"kubernetes.io/projected/93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c-kube-api-access-qxg6s\") pod \"perf-node-gather-daemonset-tbtnc\" (UID: \"93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c\") " pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"
Apr 24 22:14:12.702978 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.702951 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"
Apr 24 22:14:12.825234 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.825209 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"]
Apr 24 22:14:12.827425 ip-10-0-130-31 kubenswrapper[2581]: W0424 22:14:12.827389 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod93f4d196_25f2_47a4_8c0e_e3a2b6b2f41c.slice/crio-17a94963341c5618c05e99efe09e175ccf03e7435f26215943c818a0439dc251 WatchSource:0}: Error finding container 17a94963341c5618c05e99efe09e175ccf03e7435f26215943c818a0439dc251: Status 404 returned error can't find the container with id 17a94963341c5618c05e99efe09e175ccf03e7435f26215943c818a0439dc251
Apr 24 22:14:12.829080 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:12.829064 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 22:14:13.167573 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:13.167544 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sdjvb_57bed6cb-14ec-45db-98ca-49d0e2a82730/dns/0.log"
Apr 24 22:14:13.187212 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:13.187186 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-sdjvb_57bed6cb-14ec-45db-98ca-49d0e2a82730/kube-rbac-proxy/0.log"
Apr 24 22:14:13.230098 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:13.230071 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hjwlf_964dbac9-11de-44a8-b2ea-152ca4914413/dns-node-resolver/0.log"
Apr 24 22:14:13.367333 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:13.367301 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc" event={"ID":"93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c","Type":"ContainerStarted","Data":"063fa670ab2f4bbff5f672fb0da68dd2e66732c52c9fcd5155d0bb79a64a9d81"}
Apr 24 22:14:13.367491 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:13.367337 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc" event={"ID":"93f4d196-25f2-47a4-8c0e-e3a2b6b2f41c","Type":"ContainerStarted","Data":"17a94963341c5618c05e99efe09e175ccf03e7435f26215943c818a0439dc251"}
Apr 24 22:14:13.367491 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:13.367360 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"
Apr 24 22:14:13.384225 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:13.384176 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc" podStartSLOduration=1.3841608650000001 podStartE2EDuration="1.384160865s" podCreationTimestamp="2026-04-24 22:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:14:13.381993615 +0000 UTC m=+2812.491087099" watchObservedRunningTime="2026-04-24 22:14:13.384160865 +0000 UTC m=+2812.493254407"
Apr 24 22:14:13.666398 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:13.666352 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5954c7d68c-jd96l_628cebd3-36c6-447e-be5a-217fb026b917/registry/0.log"
Apr 24 22:14:13.738326 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:13.738282 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kplsb_4b00449f-18b5-4507-83ec-4a003e10f7fb/node-ca/0.log"
Apr 24 22:14:14.807143 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:14.807115 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zdbnj_78d3eb37-5559-42a5-b81b-c2219787cc5b/serve-healthcheck-canary/0.log"
Apr 24 22:14:15.210132 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:15.210050 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dfkzt_961f627d-2e0c-4ad8-95af-5e589bce04df/kube-rbac-proxy/0.log"
Apr 24 22:14:15.229593 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:15.229571 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dfkzt_961f627d-2e0c-4ad8-95af-5e589bce04df/exporter/0.log"
Apr 24 22:14:15.249894 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:15.249871 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dfkzt_961f627d-2e0c-4ad8-95af-5e589bce04df/extractor/0.log"
Apr 24 22:14:17.318639 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:17.318605 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-67f77cd7d7-2fxsv_57df7126-9e58-4277-9d60-5589e17b0901/manager/0.log"
Apr 24 22:14:17.339335 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:17.339297 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-bqnv4_99bfde4d-b1e5-40ab-a9de-decfb64464b1/manager/0.log"
Apr 24 22:14:17.359297 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:17.359270 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-cv8gm_2ed14a9c-5fc2-4bff-9f41-8ac3ab230973/server/0.log"
Apr 24 22:14:17.813911 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:17.813878 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-7twt6_f9c29421-d9fc-45e4-8f9c-91ff424e0229/manager/0.log"
Apr 24 22:14:17.863326 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:17.863302 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-dwp6q_399106c1-f819-42fa-9829-852289131233/seaweedfs/0.log"
Apr 24 22:14:19.380264 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:19.380239 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-ltmfk/perf-node-gather-daemonset-tbtnc"
Apr 24 22:14:21.629955 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:21.629928 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-qjc8k_67b04c4e-0208-4fe3-b15b-96d48c530953/migrator/0.log"
Apr 24 22:14:21.651675 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:21.651645 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-qjc8k_67b04c4e-0208-4fe3-b15b-96d48c530953/graceful-termination/0.log"
Apr 24 22:14:22.892439 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:22.892409 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2ptsg_cb4c0791-332c-4626-884a-8947b04761c9/kube-multus/0.log"
Apr 24 22:14:23.256177 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:23.256154 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ps7q2_f4586cbe-e12f-4084-9d26-5a60d4858635/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:14:23.279010 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:23.278988 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ps7q2_f4586cbe-e12f-4084-9d26-5a60d4858635/egress-router-binary-copy/0.log"
Apr 24 22:14:23.298061 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:23.298042 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ps7q2_f4586cbe-e12f-4084-9d26-5a60d4858635/cni-plugins/0.log"
Apr 24 22:14:23.318466 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:23.318442 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ps7q2_f4586cbe-e12f-4084-9d26-5a60d4858635/bond-cni-plugin/0.log"
Apr 24 22:14:23.338584 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:23.338563 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ps7q2_f4586cbe-e12f-4084-9d26-5a60d4858635/routeoverride-cni/0.log"
Apr 24 22:14:23.357576 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:23.357548 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ps7q2_f4586cbe-e12f-4084-9d26-5a60d4858635/whereabouts-cni-bincopy/0.log"
Apr 24 22:14:23.376518 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:23.376500 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-ps7q2_f4586cbe-e12f-4084-9d26-5a60d4858635/whereabouts-cni/0.log"
Apr 24 22:14:23.441530 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:23.441503 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-489tz_5dfd7cf1-e10a-410e-b412-be269391a904/network-metrics-daemon/0.log"
Apr 24 22:14:23.462086 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:23.462066 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-489tz_5dfd7cf1-e10a-410e-b412-be269391a904/kube-rbac-proxy/0.log"
Apr 24 22:14:24.535212 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:24.535136 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-controller/0.log"
Apr 24 22:14:24.553181 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:24.553160 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/0.log"
Apr 24 22:14:24.578465 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:24.578423 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovn-acl-logging/1.log"
Apr 24 22:14:24.600253 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:24.600222 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/kube-rbac-proxy-node/0.log"
Apr 24 22:14:24.622566 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:24.622545 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:14:24.641816 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:24.641790 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/northd/0.log"
Apr 24 22:14:24.662206 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:24.662181 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/nbdb/0.log"
Apr 24 22:14:24.684162 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:24.684137 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/sbdb/0.log"
Apr 24 22:14:24.859915 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:24.859834 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-8j4mf_db3d63f4-067a-47a5-b441-e08cbb119ecd/ovnkube-controller/0.log"
Apr 24 22:14:26.088736 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:26.088704 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-2f65f_acca6a48-f7ff-4dec-82df-945011bc308d/network-check-target-container/0.log"
Apr 24 22:14:26.953907 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:26.953879 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-45fvz_ad14c53e-e5b6-4cbb-9e60-af19eb6027a6/iptables-alerter/0.log"
Apr 24 22:14:27.576844 ip-10-0-130-31 kubenswrapper[2581]: I0424 22:14:27.576804 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-lnqjv_84da9595-3baf-4bee-854d-b2858b093de3/tuned/0.log"