Apr 17 16:28:41.887035 ip-10-0-129-144 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 16:28:41.887047 ip-10-0-129-144 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 16:28:41.887057 ip-10-0-129-144 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 16:28:41.887350 ip-10-0-129-144 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 16:28:52.100663 ip-10-0-129-144 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 16:28:52.100683 ip-10-0-129-144 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot e7ef0503465849feac189464a37d7bcb --
Apr 17 16:31:18.651002 ip-10-0-129-144 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 16:31:19.116872 ip-10-0-129-144 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:19.116872 ip-10-0-129-144 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 16:31:19.116872 ip-10-0-129-144 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:19.116872 ip-10-0-129-144 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 16:31:19.116872 ip-10-0-129-144 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:31:19.119286 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.119190 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 17 16:31:19.123171 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123155 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:31:19.123171 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123170 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:31:19.123233 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123174 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:31:19.123233 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123177 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:31:19.123233 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123181 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:31:19.123233 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123185 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:31:19.123233 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123188 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:31:19.123233 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123192 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:31:19.123233 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123194 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:31:19.123233 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123197 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:19.123233 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123200 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:31:19.123233 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123203 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:31:19.123233 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123205 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:31:19.123233 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123208 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:31:19.123233 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123210 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:31:19.123233 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123213 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:31:19.123233 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123216 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:31:19.123233 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123218 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:31:19.123233 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123221 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:31:19.123233 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123223 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:31:19.123233 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123226 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 
16:31:19.123233 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123229 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:31:19.123707 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123233 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 16:31:19.123707 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123237 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:31:19.123707 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123240 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:31:19.123707 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123243 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:31:19.123707 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123246 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:31:19.123707 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123249 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:31:19.123707 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123252 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:31:19.123707 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123255 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:31:19.123707 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123258 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:31:19.123707 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123261 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:31:19.123707 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123270 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:31:19.123707 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123272 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:31:19.123707 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123275 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:31:19.123707 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123278 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:31:19.123707 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123280 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:31:19.123707 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123283 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:31:19.123707 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123285 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:31:19.123707 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123288 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:31:19.123707 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123290 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:31:19.124188 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123293 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:31:19.124188 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123295 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 
16:31:19.124188 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123298 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:31:19.124188 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123300 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:31:19.124188 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123304 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:31:19.124188 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123306 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:31:19.124188 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123309 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:31:19.124188 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123313 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 16:31:19.124188 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123316 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:31:19.124188 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123319 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:31:19.124188 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123321 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:31:19.124188 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123324 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:31:19.124188 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123326 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:31:19.124188 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123329 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:31:19.124188 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123332 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:31:19.124188 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123335 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:31:19.124188 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123338 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:31:19.124188 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123341 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:19.124188 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123345 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:31:19.124188 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123347 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:31:19.124670 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123350 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:31:19.124670 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123353 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:31:19.124670 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123355 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:31:19.124670 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123358 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:31:19.124670 ip-10-0-129-144 
kubenswrapper[2574]: W0417 16:31:19.123361 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:31:19.124670 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123363 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:31:19.124670 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123366 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:31:19.124670 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123368 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:31:19.124670 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123371 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:31:19.124670 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123374 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:31:19.124670 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123377 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:31:19.124670 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123379 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:31:19.124670 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123382 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:31:19.124670 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123384 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:31:19.124670 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123387 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:31:19.124670 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123389 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:19.124670 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123392 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:31:19.124670 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123394 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:31:19.124670 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123397 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:31:19.124670 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123399 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:31:19.125206 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123402 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:31:19.125206 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123404 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:31:19.125206 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123408 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:19.125206 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123410 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:19.125206 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123413 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:31:19.125206 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123840 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:31:19.125206 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123846 2574 feature_gate.go:328] 
unrecognized feature gate: MachineConfigNodes Apr 17 16:31:19.125206 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123849 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:31:19.125206 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123852 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:31:19.125206 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123854 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:31:19.125206 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123857 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:31:19.125206 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123860 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:19.125206 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123862 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:31:19.125206 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123865 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:31:19.125206 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123868 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:31:19.125206 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123870 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:31:19.125206 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123873 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:31:19.125206 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123876 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:31:19.125206 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123878 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:31:19.125206 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123881 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:31:19.125683 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123883 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:31:19.125683 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123886 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:31:19.125683 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123888 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:31:19.125683 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123892 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:31:19.125683 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123894 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:31:19.125683 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123897 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:31:19.125683 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123899 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:31:19.125683 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123902 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:31:19.125683 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123905 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:31:19.125683 ip-10-0-129-144 
kubenswrapper[2574]: W0417 16:31:19.123908 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:31:19.125683 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123910 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:31:19.125683 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123913 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:31:19.125683 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123915 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:31:19.125683 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123918 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:31:19.125683 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123921 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:31:19.125683 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123923 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:31:19.125683 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123932 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:31:19.125683 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123936 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:31:19.125683 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123938 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:31:19.126178 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123941 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:31:19.126178 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123944 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:31:19.126178 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123946 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:31:19.126178 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123948 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:31:19.126178 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123951 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:31:19.126178 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123953 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:31:19.126178 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123956 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:31:19.126178 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123960 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 16:31:19.126178 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123964 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:31:19.126178 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123967 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:31:19.126178 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123974 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:31:19.126178 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123978 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:31:19.126178 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123980 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:31:19.126178 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123983 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:31:19.126178 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123985 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:31:19.126178 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123988 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:31:19.126178 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123991 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:31:19.126178 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123993 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:31:19.126178 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123996 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:31:19.126639 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.123998 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:31:19.126639 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124001 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:31:19.126639 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124003 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:31:19.126639 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124007 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:31:19.126639 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124009 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:31:19.126639 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124012 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:31:19.126639 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124014 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:19.126639 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124017 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:19.126639 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124019 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:31:19.126639 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124021 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:31:19.126639 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124024 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:31:19.126639 ip-10-0-129-144 
kubenswrapper[2574]: W0417 16:31:19.124027 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:31:19.126639 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124030 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:19.126639 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124032 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:19.126639 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124039 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 16:31:19.126639 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124043 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:31:19.126639 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124046 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:31:19.126639 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124048 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:31:19.126639 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124051 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:31:19.127120 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124054 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:31:19.127120 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124056 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:31:19.127120 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124059 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:31:19.127120 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124061 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:31:19.127120 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124064 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:31:19.127120 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124070 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:31:19.127120 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124073 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:31:19.127120 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124075 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:31:19.127120 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124078 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:31:19.127120 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124080 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:31:19.127120 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124083 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:31:19.127120 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124085 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:31:19.127120 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124088 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:31:19.127120 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.124090 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:31:19.127120 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125697 2574 
flags.go:64] FLAG: --address="0.0.0.0" Apr 17 16:31:19.127120 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125707 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 17 16:31:19.127120 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125714 2574 flags.go:64] FLAG: --anonymous-auth="true" Apr 17 16:31:19.127120 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125719 2574 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 17 16:31:19.127120 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125723 2574 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 17 16:31:19.127120 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125726 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 17 16:31:19.127120 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125741 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125746 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125749 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125753 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125757 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125760 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125763 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125766 2574 flags.go:64] FLAG: --cgroup-root="" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125770 2574 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125773 2574 flags.go:64] FLAG: --client-ca-file="" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125776 2574 flags.go:64] FLAG: --cloud-config="" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125778 2574 flags.go:64] FLAG: --cloud-provider="external" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125781 2574 flags.go:64] FLAG: --cluster-dns="[]" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125785 2574 flags.go:64] FLAG: --cluster-domain="" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125788 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125791 2574 flags.go:64] FLAG: --config-dir="" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125794 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125798 2574 flags.go:64] FLAG: --container-log-max-files="5" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125802 2574 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125805 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 17 
16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125808 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125812 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125815 2574 flags.go:64] FLAG: --contention-profiling="false" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125818 2574 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 17 16:31:19.127632 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125821 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125824 2574 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125827 2574 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125831 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125834 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125837 2574 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125840 2574 flags.go:64] FLAG: --enable-load-reader="false" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125843 2574 flags.go:64] FLAG: --enable-server="true" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125846 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125850 2574 flags.go:64] FLAG: --event-burst="100" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125856 2574 flags.go:64] FLAG: --event-qps="50" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125859 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125863 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125866 2574 flags.go:64] FLAG: --eviction-hard="" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125869 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125872 2574 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125875 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125879 2574 flags.go:64] FLAG: --eviction-soft="" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125881 2574 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125884 2574 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125887 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125890 2574 flags.go:64] FLAG: 
--experimental-mounter-path="" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125893 2574 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125896 2574 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125899 2574 flags.go:64] FLAG: --feature-gates="" Apr 17 16:31:19.128227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125903 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125906 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125910 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125914 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125917 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125920 2574 flags.go:64] FLAG: --help="false" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125922 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-129-144.ec2.internal" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125925 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125928 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125931 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125935 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125938 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125941 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125944 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125946 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125949 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125952 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125960 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125963 2574 flags.go:64] FLAG: --kube-reserved="" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125966 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125969 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125972 2574 flags.go:64] FLAG: 
--kubelet-cgroups="" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125975 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125978 2574 flags.go:64] FLAG: --lock-file="" Apr 17 16:31:19.128947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125981 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125984 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125987 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125992 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125995 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.125998 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126001 2574 flags.go:64] FLAG: --logging-format="text" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126004 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126007 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126010 2574 flags.go:64] FLAG: --manifest-url="" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126013 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126018 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126021 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126025 2574 flags.go:64] FLAG: --max-pods="110" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126028 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126031 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126034 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126037 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126040 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126042 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126045 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126052 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126056 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 16:31:19.129530 ip-10-0-129-144 
kubenswrapper[2574]: I0417 16:31:19.126059 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 16:31:19.129530 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126062 2574 flags.go:64] FLAG: --pod-cidr="" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126067 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126072 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126075 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126078 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126081 2574 flags.go:64] FLAG: --port="10250" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126084 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126087 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0bb85127c72632571" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126091 2574 flags.go:64] FLAG: --qos-reserved="" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126094 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126097 2574 flags.go:64] FLAG: --register-node="true" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126100 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126103 2574 flags.go:64] FLAG: --register-with-taints="" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126107 2574 flags.go:64] FLAG: --registry-burst="10" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126110 2574 flags.go:64] FLAG: --registry-qps="5" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126112 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126115 2574 flags.go:64] FLAG: --reserved-memory="" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126119 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126122 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126124 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126128 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126130 2574 flags.go:64] FLAG: --runonce="false" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126134 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126137 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 16:31:19.130164 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126139 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 17 16:31:19.130164 
ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126142 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126145 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126148 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126151 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126154 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126157 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126161 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126164 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126168 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126171 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126173 2574 flags.go:64] FLAG: --system-cgroups="" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126176 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126182 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126185 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126188 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126192 2574 flags.go:64] FLAG: --tls-min-version="" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126195 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126198 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126200 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126203 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126206 2574 flags.go:64] FLAG: --v="2" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126211 2574 flags.go:64] FLAG: --version="false" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126215 2574 flags.go:64] FLAG: --vmodule="" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126220 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.126223 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 16:31:19.130798 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126315 2574 feature_gate.go:328] unrecognized feature gate: 
MachineAPIMigration Apr 17 16:31:19.131403 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126318 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:31:19.131403 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126321 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:31:19.131403 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126324 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:31:19.131403 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126327 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:31:19.131403 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126330 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:19.131403 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126332 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:31:19.131403 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126335 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:31:19.131403 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126338 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:31:19.131403 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126343 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 16:31:19.131403 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126347 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:31:19.131403 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126350 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:19.131403 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126353 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:31:19.131403 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126357 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:19.131403 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126360 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:31:19.131403 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126364 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:31:19.131403 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126366 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:31:19.131403 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126369 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:31:19.131403 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126372 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:31:19.131403 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126375 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:31:19.131873 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126378 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:31:19.131873 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126380 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:31:19.131873 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126383 2574 feature_gate.go:328] unrecognized 
feature gate: ManagedBootImagesAWS Apr 17 16:31:19.131873 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126389 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:31:19.131873 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126392 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:31:19.131873 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126394 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:31:19.131873 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126397 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:19.131873 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126399 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:31:19.131873 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126402 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:31:19.131873 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126404 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:31:19.131873 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126407 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:31:19.131873 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126410 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:31:19.131873 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126412 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:31:19.131873 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126415 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:31:19.131873 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126417 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:31:19.131873 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126420 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:31:19.131873 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126423 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:31:19.131873 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126425 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:31:19.131873 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126428 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:31:19.132373 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126430 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:31:19.132373 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126433 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:31:19.132373 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126435 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:31:19.132373 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126438 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:31:19.132373 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126440 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:31:19.132373 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126443 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:31:19.132373 
ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126447 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:31:19.132373 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126450 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:31:19.132373 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126453 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:31:19.132373 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126455 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:19.132373 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126458 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:31:19.132373 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126460 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:31:19.132373 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126463 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:31:19.132373 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126465 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:31:19.132373 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126468 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:31:19.132373 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126470 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:31:19.132373 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126473 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:31:19.132373 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126476 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:31:19.132373 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126478 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:31:19.132373 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126481 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:31:19.132872 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126483 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:31:19.132872 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126486 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:31:19.132872 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126488 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:31:19.132872 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126491 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:31:19.132872 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126494 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:31:19.132872 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126498 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:31:19.132872 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126501 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:31:19.132872 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126504 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:31:19.132872 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126506 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:31:19.132872 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126508 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:31:19.132872 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126512 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:31:19.132872 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126514 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:31:19.132872 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126517 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:31:19.132872 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126519 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:31:19.132872 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126522 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:31:19.132872 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126524 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:31:19.132872 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126527 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:31:19.132872 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126530 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:31:19.132872 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126533 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:31:19.132872 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126536 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:31:19.133487 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126539 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:31:19.133487 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126541 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:31:19.133487 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126544 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:31:19.133487 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126547 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:31:19.133487 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126549 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:31:19.133487 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126552 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:31:19.133487 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.126554 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:31:19.133487 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.127219 2574 feature_gate.go:384] feature gates: 
{map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:31:19.135363 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.135243 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 16:31:19.135363 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.135361 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 16:31:19.135493 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135417 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:31:19.135493 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135422 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:31:19.135493 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135425 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:31:19.135493 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135429 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:31:19.135493 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135432 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:31:19.135493 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135435 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:31:19.135493 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135438 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:31:19.135493 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135440 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:31:19.135493 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135443 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:31:19.135493 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135446 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:31:19.135493 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135449 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:31:19.135493 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135451 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:31:19.135493 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135454 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:31:19.135493 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135458 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:19.135493 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135460 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:31:19.135493 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135463 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:31:19.135493 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135465 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:31:19.135493 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135468 2574 
feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:31:19.135493 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135470 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:31:19.135493 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135473 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:31:19.136022 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135475 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:31:19.136022 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135478 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:31:19.136022 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135480 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:31:19.136022 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135483 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:31:19.136022 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135486 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:31:19.136022 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135488 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:31:19.136022 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135491 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:31:19.136022 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135493 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:19.136022 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135497 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:31:19.136022 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135500 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:31:19.136022 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135503 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:31:19.136022 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135506 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:31:19.136022 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135509 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:31:19.136022 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135511 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:31:19.136022 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135514 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:31:19.136022 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135516 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:31:19.136022 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135519 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:19.136022 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135521 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:31:19.136022 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135524 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:31:19.136022 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135527 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Apr 17 16:31:19.136513 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135531 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:31:19.136513 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135534 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:31:19.136513 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135536 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:19.136513 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135539 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:31:19.136513 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135542 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:31:19.136513 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135544 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:31:19.136513 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135547 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:31:19.136513 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135550 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:31:19.136513 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135552 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:31:19.136513 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135556 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 16:31:19.136513 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135560 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:31:19.136513 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135563 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:31:19.136513 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135566 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:31:19.136513 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135570 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:31:19.136513 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135573 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:31:19.136513 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135575 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:31:19.136513 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135578 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:31:19.136513 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135580 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:31:19.136513 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135583 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:31:19.136997 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135586 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:31:19.136997 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135588 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:31:19.136997 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135591 2574 feature_gate.go:328] unrecognized feature gate: 
UpgradeStatus Apr 17 16:31:19.136997 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135593 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:31:19.136997 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135596 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:31:19.136997 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135598 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:31:19.136997 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135601 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:31:19.136997 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135604 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:31:19.136997 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135606 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:31:19.136997 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135608 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:31:19.136997 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135611 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:31:19.136997 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135614 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:31:19.136997 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135616 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:31:19.136997 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135619 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:31:19.136997 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135622 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:31:19.136997 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135625 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:31:19.136997 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135627 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:31:19.136997 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135631 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:31:19.136997 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135634 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:31:19.136997 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135636 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:31:19.137488 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135638 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:31:19.137488 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135641 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:19.137488 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135643 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:31:19.137488 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135646 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:31:19.137488 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135649 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 
16:31:19.137488 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135651 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:31:19.137488 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135654 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:31:19.137488 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.135659 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:31:19.137488 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135770 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 16:31:19.137488 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135776 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:31:19.137488 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135779 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:31:19.137488 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135782 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:31:19.137488 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135785 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:31:19.137488 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135787 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 16:31:19.137488 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135790 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:31:19.137898 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135793 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:31:19.137898 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135795 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:31:19.137898 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135798 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:31:19.137898 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135801 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:31:19.137898 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135803 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:31:19.137898 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135806 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:31:19.137898 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135808 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:31:19.137898 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135811 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:31:19.137898 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135814 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:31:19.137898 ip-10-0-129-144 kubenswrapper[2574]: W0417 
16:31:19.135816 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:31:19.137898 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135819 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:31:19.137898 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135822 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:31:19.137898 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135824 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:31:19.137898 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135827 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:31:19.137898 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135829 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:31:19.137898 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135833 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 17 16:31:19.137898 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135836 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:31:19.137898 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135839 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:31:19.137898 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135841 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:31:19.137898 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135844 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:31:19.138380 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135846 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:31:19.138380 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135849 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:31:19.138380 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135852 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 16:31:19.138380 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135856 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:31:19.138380 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135859 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:31:19.138380 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135861 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:31:19.138380 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135864 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:31:19.138380 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135866 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:31:19.138380 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135869 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 16:31:19.138380 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135872 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 16:31:19.138380 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135875 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:31:19.138380 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135877 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:31:19.138380 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135880 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:31:19.138380 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135882 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:31:19.138380 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135885 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:31:19.138380 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135887 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:31:19.138380 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135890 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:31:19.138380 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135892 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:31:19.138380 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135895 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:31:19.138380 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135897 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:31:19.139029 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135900 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:31:19.139029 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135902 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:31:19.139029 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135905 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:31:19.139029 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135907 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 16:31:19.139029 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135910 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 16:31:19.139029 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135912 2574 
feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:31:19.139029 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135915 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:31:19.139029 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135917 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:31:19.139029 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135920 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:31:19.139029 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135922 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:31:19.139029 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135925 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:31:19.139029 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135927 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:31:19.139029 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135930 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 16:31:19.139029 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135932 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:31:19.139029 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135935 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:31:19.139029 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135938 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:31:19.139029 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135941 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:31:19.139029 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135943 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:31:19.139029 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135946 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:31:19.139029 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135948 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:31:19.139527 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135951 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:31:19.139527 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135953 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 16:31:19.139527 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135956 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:31:19.139527 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135958 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:31:19.139527 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135961 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:31:19.139527 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135963 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:31:19.139527 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135967 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 16:31:19.139527 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135970 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:31:19.139527 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135973 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:31:19.139527 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135976 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:31:19.139527 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135979 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:31:19.139527 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135982 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:31:19.139527 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135985 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:31:19.139527 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135987 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:31:19.139527 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135990 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:31:19.139527 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135993 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:31:19.139527 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135995 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:31:19.139527 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.135998 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:31:19.139527 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:19.136000 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:31:19.140011 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.136005 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 16:31:19.140011 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.136810 2574 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 16:31:19.141162 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.141148 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 16:31:19.142074 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.142062 2574 server.go:1019] "Starting client certificate rotation" Apr 17 16:31:19.142174 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.142156 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 16:31:19.142206 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.142193 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 16:31:19.171103 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.171075 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" 
name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 16:31:19.174178 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.174158 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 16:31:19.188592 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.188567 2574 log.go:25] "Validated CRI v1 runtime API" Apr 17 16:31:19.194579 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.194560 2574 log.go:25] "Validated CRI v1 image API" Apr 17 16:31:19.195901 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.195884 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 16:31:19.200119 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.200091 2574 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 bd8b9ad6-1bde-4137-9b05-b3e511a508d4:/dev/nvme0n1p3 cd8d50e5-49a3-4593-90ea-cf1bc00538a3:/dev/nvme0n1p4] Apr 17 16:31:19.200181 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.200117 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 16:31:19.203194 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.203174 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 16:31:19.206681 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.206568 2574 manager.go:217] Machine: {Timestamp:2026-04-17 16:31:19.204712197 +0000 UTC m=+0.428039966 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100757 MemoryCapacity:33164480512 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec264682f2ff421fba5878728505bf28 SystemUUID:ec264682-f2ff-421f-ba58-78728505bf28 BootID:e7ef0503-4658-49fe-ac18-9464a37d7bcb Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582238208 Type:vfs Inodes:4048398 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:fe:fc:63:ec:e7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:fe:fc:63:ec:e7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5e:90:b3:88:3a:0b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164480512 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} 
{Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 16:31:19.206681 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.206677 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 17 16:31:19.206835 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.206771 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 16:31:19.207912 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.207890 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 16:31:19.208051 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.207915 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-144.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 16:31:19.208097 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.208060 2574 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 16:31:19.208097 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.208068 2574 
container_manager_linux.go:306] "Creating device plugin manager" Apr 17 16:31:19.208097 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.208081 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 16:31:19.208863 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.208852 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 16:31:19.210317 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.210307 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 17 16:31:19.210426 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.210417 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 16:31:19.213333 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.213323 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 17 16:31:19.213368 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.213336 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 16:31:19.213368 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.213349 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 16:31:19.213368 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.213358 2574 kubelet.go:397] "Adding apiserver pod source" Apr 17 16:31:19.213368 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.213367 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 16:31:19.214503 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.214489 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 16:31:19.214503 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.214506 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 16:31:19.218518 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.218502 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 16:31:19.219984 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.219967 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 16:31:19.221967 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.221954 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 16:31:19.221967 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.221971 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 16:31:19.222074 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.221977 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 16:31:19.222074 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.221983 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 16:31:19.222074 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.221989 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 16:31:19.222074 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.221994 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 16:31:19.222074 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.222001 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 16:31:19.222074 ip-10-0-129-144 kubenswrapper[2574]: I0417 
16:31:19.222006 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 16:31:19.222074 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.222013 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 16:31:19.222074 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.222026 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 16:31:19.222074 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.222038 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 16:31:19.222074 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.222046 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 16:31:19.223976 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.223963 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 16:31:19.223976 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.223975 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 16:31:19.225489 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:19.225465 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 16:31:19.225549 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:19.225485 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 16:31:19.225817 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.225799 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2rm5d" Apr 17 16:31:19.227793 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.227779 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 16:31:19.227843 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.227816 2574 server.go:1295] "Started kubelet" Apr 17 16:31:19.227897 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.227881 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 16:31:19.228002 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.227963 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 16:31:19.228055 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.228022 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 16:31:19.229289 ip-10-0-129-144 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 16:31:19.229482 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.229463 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 16:31:19.231032 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.231016 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 17 16:31:19.232990 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.232965 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-2rm5d" Apr 17 16:31:19.234564 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.234548 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 16:31:19.235550 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:19.234619 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-144.ec2.internal.18a731eeaeee7e25 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-144.ec2.internal,UID:ip-10-0-129-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-144.ec2.internal,},FirstTimestamp:2026-04-17 16:31:19.227792933 +0000 UTC m=+0.451120705,LastTimestamp:2026-04-17 16:31:19.227792933 +0000 UTC m=+0.451120705,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-144.ec2.internal,}" Apr 17 16:31:19.238025 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.238007 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 16:31:19.238116 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.238028 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 16:31:19.238847 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.238832 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 16:31:19.238934 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.238851 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 16:31:19.238934 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.238861 2574 factory.go:55] Registering systemd factory Apr 17 16:31:19.238934 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.238879 2574 factory.go:223] Registration of the systemd container factory successfully Apr 17 16:31:19.239077 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.238961 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 16:31:19.239077 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.239020 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 17 16:31:19.239077 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.239028 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 17 16:31:19.239077 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:19.239024 2574 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 16:31:19.239267 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.239105 2574 factory.go:153] Registering CRI-O factory Apr 17 16:31:19.239267 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.239133 2574 factory.go:223] Registration of the crio container factory successfully Apr 17 16:31:19.239267 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:19.239160 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-144.ec2.internal\" not found" Apr 17 16:31:19.239267 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.239210 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 16:31:19.239267 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.239232 2574 factory.go:103] Registering Raw factory Apr 17 16:31:19.239267 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.239249 2574 manager.go:1196] Started watching for new ooms in manager Apr 17 16:31:19.239693 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.239680 2574 manager.go:319] Starting recovery of all containers Apr 17 16:31:19.251075 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.251054 2574 manager.go:324] Recovery completed Apr 17 16:31:19.251836 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.251815 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:19.252516 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:19.252491 2574 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": readdirent /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 17 16:31:19.255378 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.255364 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:19.257716 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.257701 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-144.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:19.257795 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.257763 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:19.257795 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.257777 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-144.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:19.258092 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:19.258076 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-129-144.ec2.internal\" not found" node="ip-10-0-129-144.ec2.internal" Apr 17 16:31:19.258403 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.258388 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 16:31:19.258403 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.258402 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 16:31:19.258491 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.258420 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 17 16:31:19.260918 ip-10-0-129-144 
kubenswrapper[2574]: I0417 16:31:19.260908 2574 policy_none.go:49] "None policy: Start" Apr 17 16:31:19.260960 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.260923 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 16:31:19.260960 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.260932 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 17 16:31:19.295413 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.295391 2574 manager.go:341] "Starting Device Plugin manager" Apr 17 16:31:19.295513 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:19.295450 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 16:31:19.295513 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.295462 2574 server.go:85] "Starting device plugin registration server" Apr 17 16:31:19.295938 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.295756 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 16:31:19.295938 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.295769 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 16:31:19.295938 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.295875 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 16:31:19.296140 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.295948 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 16:31:19.296140 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.295958 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 16:31:19.297312 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:19.297290 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 16:31:19.297396 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:19.297329 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-144.ec2.internal\" not found" Apr 17 16:31:19.375673 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.375596 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 16:31:19.376790 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.376778 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 16:31:19.376852 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.376804 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 16:31:19.376852 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.376823 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 16:31:19.376852 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.376828 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 16:31:19.376982 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:19.376904 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 16:31:19.381312 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.381295 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:19.396393 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.396361 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:19.397382 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.397364 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-144.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:19.397481 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.397394 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:19.397481 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.397405 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-144.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:19.397481 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.397427 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-144.ec2.internal" Apr 17 16:31:19.405936 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.405921 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-144.ec2.internal" Apr 17 16:31:19.406021 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:19.405942 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-144.ec2.internal\": node \"ip-10-0-129-144.ec2.internal\" not found" Apr 17 16:31:19.418547 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:19.418530 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-144.ec2.internal\" not found" Apr 17 16:31:19.477602 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.477558 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-144.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-144.ec2.internal"] Apr 17 16:31:19.477707 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.477649 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:19.479122 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.479102 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-144.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:19.479197 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.479135 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:19.479197 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.479149 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-144.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:19.480373 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.480359 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:19.480519 ip-10-0-129-144 
kubenswrapper[2574]: I0417 16:31:19.480504 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-144.ec2.internal" Apr 17 16:31:19.480593 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.480533 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:19.481071 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.481054 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-144.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:19.481145 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.481079 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-144.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:19.481145 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.481101 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:19.481145 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.481113 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-144.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:19.481145 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.481082 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:19.481145 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.481146 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-144.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:19.482232 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.482218 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-144.ec2.internal" Apr 17 16:31:19.482272 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.482243 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:31:19.482848 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.482830 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-144.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:31:19.482961 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.482858 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:31:19.482961 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.482868 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-144.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:31:19.507437 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:19.507413 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-144.ec2.internal\" not found" node="ip-10-0-129-144.ec2.internal" Apr 17 16:31:19.511889 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:19.511873 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-144.ec2.internal\" not found" node="ip-10-0-129-144.ec2.internal" Apr 17 16:31:19.519544 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:19.519530 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-144.ec2.internal\" not found" Apr 17 16:31:19.619974 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:19.619951 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-144.ec2.internal\" not found" Apr 17 16:31:19.640139 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.640083 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9666662d7a9a022fd7fae8af93e7302-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-144.ec2.internal\" (UID: \"d9666662d7a9a022fd7fae8af93e7302\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-144.ec2.internal" Apr 17 16:31:19.640139 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.640113 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c1defed80364f63d2790d12bb9e06eb2-config\") pod \"kube-apiserver-proxy-ip-10-0-129-144.ec2.internal\" (UID: \"c1defed80364f63d2790d12bb9e06eb2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-144.ec2.internal" Apr 17 16:31:19.640139 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.640131 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d9666662d7a9a022fd7fae8af93e7302-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-144.ec2.internal\" (UID: \"d9666662d7a9a022fd7fae8af93e7302\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-144.ec2.internal" Apr 17 16:31:19.720463 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:19.720420 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-144.ec2.internal\" not found" Apr 17 16:31:19.740743 
ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.740705 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d9666662d7a9a022fd7fae8af93e7302-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-144.ec2.internal\" (UID: \"d9666662d7a9a022fd7fae8af93e7302\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-144.ec2.internal" Apr 17 16:31:19.740800 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.740748 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9666662d7a9a022fd7fae8af93e7302-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-144.ec2.internal\" (UID: \"d9666662d7a9a022fd7fae8af93e7302\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-144.ec2.internal" Apr 17 16:31:19.740800 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.740780 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c1defed80364f63d2790d12bb9e06eb2-config\") pod \"kube-apiserver-proxy-ip-10-0-129-144.ec2.internal\" (UID: \"c1defed80364f63d2790d12bb9e06eb2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-144.ec2.internal" Apr 17 16:31:19.740860 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.740811 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c1defed80364f63d2790d12bb9e06eb2-config\") pod \"kube-apiserver-proxy-ip-10-0-129-144.ec2.internal\" (UID: \"c1defed80364f63d2790d12bb9e06eb2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-144.ec2.internal" Apr 17 16:31:19.740860 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.740827 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9666662d7a9a022fd7fae8af93e7302-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-144.ec2.internal\" (UID: \"d9666662d7a9a022fd7fae8af93e7302\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-144.ec2.internal" Apr 17 16:31:19.740860 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.740811 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d9666662d7a9a022fd7fae8af93e7302-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-144.ec2.internal\" (UID: \"d9666662d7a9a022fd7fae8af93e7302\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-144.ec2.internal" Apr 17 16:31:19.811911 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.811871 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-144.ec2.internal" Apr 17 16:31:19.814635 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:19.814618 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-144.ec2.internal" Apr 17 16:31:19.821461 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:19.821438 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-144.ec2.internal\" not found" Apr 17 16:31:19.922016 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:19.921916 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-144.ec2.internal\" not found" Apr 17 16:31:20.022392 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:20.022366 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-144.ec2.internal\" not found" Apr 17 16:31:20.122947 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:20.122918 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-144.ec2.internal\" not found" Apr 17 16:31:20.143253 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:20.143233 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 16:31:20.143386 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:20.143368 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 16:31:20.143434 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:20.143387 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 16:31:20.165622 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:20.165595 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:20.223471 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:20.223394 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-144.ec2.internal\" not found" Apr 17 16:31:20.235584 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:20.235540 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 16:26:19 +0000 UTC" deadline="2027-11-20 04:18:43.080728674 +0000 UTC" Apr 17 16:31:20.235584 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:20.235579 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13955h47m22.845152633s" Apr 17 16:31:20.238960 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:20.238944 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 16:31:20.253676 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:20.253643 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 16:31:20.280952 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:20.280924 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-fhwm9" Apr 17 16:31:20.290775 ip-10-0-129-144 kubenswrapper[2574]: I0417 
16:31:20.290752 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-fhwm9" Apr 17 16:31:20.324428 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:20.324401 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-144.ec2.internal\" not found" Apr 17 16:31:20.369275 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:20.369246 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:20.372683 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:20.372651 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9666662d7a9a022fd7fae8af93e7302.slice/crio-edf0a23766fc033a50cdef004fa2c8257d4810f539dcd4544efb20a624955a7f WatchSource:0}: Error finding container edf0a23766fc033a50cdef004fa2c8257d4810f539dcd4544efb20a624955a7f: Status 404 returned error can't find the container with id edf0a23766fc033a50cdef004fa2c8257d4810f539dcd4544efb20a624955a7f Apr 17 16:31:20.373003 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:20.372976 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1defed80364f63d2790d12bb9e06eb2.slice/crio-20bcc90a70062ba313e26c438af7ec0f14f3af9150336d289087eb4103d6dc82 WatchSource:0}: Error finding container 20bcc90a70062ba313e26c438af7ec0f14f3af9150336d289087eb4103d6dc82: Status 404 returned error can't find the container with id 20bcc90a70062ba313e26c438af7ec0f14f3af9150336d289087eb4103d6dc82 Apr 17 16:31:20.376696 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:20.376683 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:31:20.379936 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:20.379900 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-144.ec2.internal" event={"ID":"c1defed80364f63d2790d12bb9e06eb2","Type":"ContainerStarted","Data":"20bcc90a70062ba313e26c438af7ec0f14f3af9150336d289087eb4103d6dc82"} Apr 17 16:31:20.380835 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:20.380808 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-144.ec2.internal" event={"ID":"d9666662d7a9a022fd7fae8af93e7302","Type":"ContainerStarted","Data":"edf0a23766fc033a50cdef004fa2c8257d4810f539dcd4544efb20a624955a7f"} Apr 17 16:31:20.439590 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:20.439570 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-144.ec2.internal" Apr 17 16:31:20.451878 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:20.451859 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 16:31:20.452811 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:20.452799 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-144.ec2.internal" Apr 17 16:31:20.459314 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:20.459294 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 16:31:20.969128 ip-10-0-129-144 
kubenswrapper[2574]: I0417 16:31:20.968911 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:21.215127 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.215089 2574 apiserver.go:52] "Watching apiserver" Apr 17 16:31:21.223847 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.223785 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 16:31:21.225718 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.225683 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-gwj7d","openshift-network-operator/iptables-alerter-tm6mx","openshift-ovn-kubernetes/ovnkube-node-bv26r","kube-system/kube-apiserver-proxy-ip-10-0-129-144.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw","openshift-image-registry/node-ca-6jdwh","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-144.ec2.internal","openshift-multus/multus-additional-cni-plugins-6m4wt","kube-system/konnectivity-agent-xztmj","openshift-cluster-node-tuning-operator/tuned-4nbnb","openshift-dns/node-resolver-pbmrc","openshift-multus/multus-t7lqr","openshift-multus/network-metrics-daemon-42tgv"] Apr 17 16:31:21.229238 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.229218 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.229468 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.229392 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tm6mx" Apr 17 16:31:21.232353 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.231789 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.232353 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.231821 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:21.232353 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:21.231886 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gwj7d" podUID="cad60853-6c2f-411f-935c-f5890843bbf1" Apr 17 16:31:21.232353 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.232055 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 16:31:21.232353 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.232047 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:31:21.232353 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.232200 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 16:31:21.232353 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.232323 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 16:31:21.232759 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.232326 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xhmkt\"" Apr 17 16:31:21.233140 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.233122 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-d6lcz\"" Apr 17 16:31:21.233268 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.233251 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 16:31:21.233349 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.233335 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 16:31:21.233524 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.233511 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 16:31:21.234529 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.234509 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.235014 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.234994 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 16:31:21.235708 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.235677 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-z8mmd\"" Apr 17 16:31:21.235950 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.235933 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-6jdwh" Apr 17 16:31:21.236462 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.236263 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 16:31:21.236462 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.236279 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 16:31:21.236462 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.236384 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 16:31:21.239633 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.237330 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:21.239633 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:21.237407 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42tgv" podUID="b57612e9-f335-4d71-bdba-f06f0735eee1" Apr 17 16:31:21.239633 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.237602 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 16:31:21.239633 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.237816 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.239633 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.237877 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 16:31:21.239633 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.238036 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 16:31:21.239633 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.238389 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 16:31:21.239633 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.238722 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 16:31:21.240967 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.240442 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 16:31:21.240967 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.240654 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 16:31:21.240967 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.240899 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-fldzd\"" Apr 17 16:31:21.241324 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.241307 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-fcg7b\"" Apr 17 16:31:21.243326 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.243308 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-xxvfp\"" Apr 17 16:31:21.243505 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.243464 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 16:31:21.243505 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.243488 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 16:31:21.245100 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.245081 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 16:31:21.245614 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.245593 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xztmj" Apr 17 16:31:21.245989 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.245707 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.247109 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.247089 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pbmrc" Apr 17 16:31:21.247914 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.247891 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9flm\" (UniqueName: \"kubernetes.io/projected/b57612e9-f335-4d71-bdba-f06f0735eee1-kube-api-access-s9flm\") pod \"network-metrics-daemon-42tgv\" (UID: \"b57612e9-f335-4d71-bdba-f06f0735eee1\") " pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:21.248011 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.247927 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/390c549d-4a2a-4d76-8395-5efa4f2e8a4e-device-dir\") pod \"aws-ebs-csi-driver-node-wkbvw\" (UID: \"390c549d-4a2a-4d76-8395-5efa4f2e8a4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.248011 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.247952 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-host-kubelet\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.248011 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.247976 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-host-run-netns\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.248181 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248036 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/2a3eeab2-52f2-4ba5-a534-bef4430448f8-os-release\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.248181 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248081 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/390c549d-4a2a-4d76-8395-5efa4f2e8a4e-registration-dir\") pod \"aws-ebs-csi-driver-node-wkbvw\" (UID: \"390c549d-4a2a-4d76-8395-5efa4f2e8a4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.248181 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248105 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b67a3101-cd16-466d-bb65-1fcd6158a8f4-konnectivity-ca\") pod \"konnectivity-agent-xztmj\" (UID: \"b67a3101-cd16-466d-bb65-1fcd6158a8f4\") " pod="kube-system/konnectivity-agent-xztmj" Apr 17 16:31:21.248181 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248130 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-var-lib-openvswitch\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.248181 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248165 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-run-openvswitch\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.248444 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248222 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/318876c9-9879-4dc1-a7cb-9664f3196aa9-host-slash\") pod \"iptables-alerter-tm6mx\" (UID: \"318876c9-9879-4dc1-a7cb-9664f3196aa9\") " pod="openshift-network-operator/iptables-alerter-tm6mx" Apr 17 16:31:21.248444 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248290 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/390c549d-4a2a-4d76-8395-5efa4f2e8a4e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wkbvw\" (UID: \"390c549d-4a2a-4d76-8395-5efa4f2e8a4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.248444 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248351 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-multus-cni-dir\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.248444 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248399 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-host-run-multus-certs\") pod 
\"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.248444 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248428 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a3eeab2-52f2-4ba5-a534-bef4430448f8-system-cni-dir\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.248844 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248483 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a3eeab2-52f2-4ba5-a534-bef4430448f8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.248844 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248514 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/390c549d-4a2a-4d76-8395-5efa4f2e8a4e-etc-selinux\") pod \"aws-ebs-csi-driver-node-wkbvw\" (UID: \"390c549d-4a2a-4d76-8395-5efa4f2e8a4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.248844 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248568 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-system-cni-dir\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.248844 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248600 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-host-var-lib-cni-bin\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.248844 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248626 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e613ab88-5ce8-4dda-a30c-57006804fdb0-host\") pod \"node-ca-6jdwh\" (UID: \"e613ab88-5ce8-4dda-a30c-57006804fdb0\") " pod="openshift-image-registry/node-ca-6jdwh" Apr 17 16:31:21.248844 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248652 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a3eeab2-52f2-4ba5-a534-bef4430448f8-cnibin\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.248844 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248679 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th2xn\" (UniqueName: \"kubernetes.io/projected/2a3eeab2-52f2-4ba5-a534-bef4430448f8-kube-api-access-th2xn\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 
16:31:21.248844 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248705 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-node-log\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.248844 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248757 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-etc-kubernetes\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.248844 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248790 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/677ce4c5-2494-4409-bb8c-263a71ca26d1-env-overrides\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.248844 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248806 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-host-cni-bin\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.249384 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248865 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-977m4\" (UniqueName: \"kubernetes.io/projected/677ce4c5-2494-4409-bb8c-263a71ca26d1-kube-api-access-977m4\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.249384 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248897 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-host-var-lib-kubelet\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.249384 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248920 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-multus-conf-dir\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.249384 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248942 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1a680fa9-f376-482e-af97-a722bc5b37c6-multus-daemon-config\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.249384 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.248990 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/b67a3101-cd16-466d-bb65-1fcd6158a8f4-agent-certs\") pod \"konnectivity-agent-xztmj\" (UID: \"b67a3101-cd16-466d-bb65-1fcd6158a8f4\") " pod="kube-system/konnectivity-agent-xztmj" Apr 17 16:31:21.249384 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249015 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-os-release\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.249384 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249041 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-multus-socket-dir-parent\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.249384 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249065 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/318876c9-9879-4dc1-a7cb-9664f3196aa9-iptables-alerter-script\") pod \"iptables-alerter-tm6mx\" (UID: \"318876c9-9879-4dc1-a7cb-9664f3196aa9\") " pod="openshift-network-operator/iptables-alerter-tm6mx" Apr 17 16:31:21.249384 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249088 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-systemd-units\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.249384 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249112 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-hostroot\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.249384 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249144 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-run-systemd\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.249384 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249170 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-host-run-ovn-kubernetes\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.249384 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249196 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-host-cni-netd\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" 
Apr 17 16:31:21.249384 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249221 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x24dj\" (UniqueName: \"kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj\") pod \"network-check-target-gwj7d\" (UID: \"cad60853-6c2f-411f-935c-f5890843bbf1\") " pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:21.249384 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249244 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a3eeab2-52f2-4ba5-a534-bef4430448f8-cni-binary-copy\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.249384 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249269 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-run-ovn\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.250157 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249293 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs\") pod \"network-metrics-daemon-42tgv\" (UID: \"b57612e9-f335-4d71-bdba-f06f0735eee1\") " pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:21.250157 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249316 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-host-var-lib-cni-multus\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.250157 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249339 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlhmd\" (UniqueName: \"kubernetes.io/projected/1a680fa9-f376-482e-af97-a722bc5b37c6-kube-api-access-rlhmd\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.250157 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249362 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-host-slash\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.250157 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249381 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 16:31:21.250157 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249385 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfwvs\" (UniqueName: \"kubernetes.io/projected/318876c9-9879-4dc1-a7cb-9664f3196aa9-kube-api-access-zfwvs\") pod \"iptables-alerter-tm6mx\" (UID: 
\"318876c9-9879-4dc1-a7cb-9664f3196aa9\") " pod="openshift-network-operator/iptables-alerter-tm6mx" Apr 17 16:31:21.250157 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249411 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-etc-openvswitch\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.250157 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249435 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-host-run-netns\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.250157 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249467 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-cnibin\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.250157 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249502 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/677ce4c5-2494-4409-bb8c-263a71ca26d1-ovnkube-config\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.250157 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249540 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/677ce4c5-2494-4409-bb8c-263a71ca26d1-ovn-node-metrics-cert\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.250157 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249567 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-host-run-k8s-cni-cncf-io\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.250157 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249592 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e613ab88-5ce8-4dda-a30c-57006804fdb0-serviceca\") pod \"node-ca-6jdwh\" (UID: \"e613ab88-5ce8-4dda-a30c-57006804fdb0\") " pod="openshift-image-registry/node-ca-6jdwh" Apr 17 16:31:21.250157 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249636 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjnhd\" (UniqueName: \"kubernetes.io/projected/e613ab88-5ce8-4dda-a30c-57006804fdb0-kube-api-access-xjnhd\") pod \"node-ca-6jdwh\" (UID: \"e613ab88-5ce8-4dda-a30c-57006804fdb0\") " pod="openshift-image-registry/node-ca-6jdwh" Apr 17 16:31:21.250157 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249662 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2a3eeab2-52f2-4ba5-a534-bef4430448f8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.250157 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249703 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-log-socket\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.250157 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249712 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 16:31:21.250956 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249819 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.250956 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249851 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/677ce4c5-2494-4409-bb8c-263a71ca26d1-ovnkube-script-lib\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.250956 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249874 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/390c549d-4a2a-4d76-8395-5efa4f2e8a4e-socket-dir\") pod \"aws-ebs-csi-driver-node-wkbvw\" (UID: \"390c549d-4a2a-4d76-8395-5efa4f2e8a4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.250956 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249913 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-v7ztv\"" Apr 17 16:31:21.250956 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249900 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/390c549d-4a2a-4d76-8395-5efa4f2e8a4e-sys-fs\") pod \"aws-ebs-csi-driver-node-wkbvw\" (UID: \"390c549d-4a2a-4d76-8395-5efa4f2e8a4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.250956 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249956 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjbvz\" (UniqueName: \"kubernetes.io/projected/390c549d-4a2a-4d76-8395-5efa4f2e8a4e-kube-api-access-vjbvz\") pod \"aws-ebs-csi-driver-node-wkbvw\" (UID: \"390c549d-4a2a-4d76-8395-5efa4f2e8a4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.250956 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.249981 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1a680fa9-f376-482e-af97-a722bc5b37c6-cni-binary-copy\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.250956 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.250006 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2a3eeab2-52f2-4ba5-a534-bef4430448f8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.250956 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.250097 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-b82wc\"" Apr 17 16:31:21.250956 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.250133 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:31:21.250956 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.250290 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 16:31:21.250956 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.250398 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 16:31:21.250956 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.250511 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 16:31:21.250956 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.250590 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-xwt9g\"" Apr 17 16:31:21.291808 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.291777 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:26:20 +0000 UTC" deadline="2028-01-17 16:19:29.981321732 +0000 UTC" Apr 17 16:31:21.291808 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.291807 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15359h48m8.689518544s" Apr 17 16:31:21.340503 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.340431 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 16:31:21.350294 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350266 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/677ce4c5-2494-4409-bb8c-263a71ca26d1-ovnkube-script-lib\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.350410 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350301 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/390c549d-4a2a-4d76-8395-5efa4f2e8a4e-socket-dir\") pod \"aws-ebs-csi-driver-node-wkbvw\" (UID: \"390c549d-4a2a-4d76-8395-5efa4f2e8a4e\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.350410 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350328 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/390c549d-4a2a-4d76-8395-5efa4f2e8a4e-sys-fs\") pod \"aws-ebs-csi-driver-node-wkbvw\" (UID: \"390c549d-4a2a-4d76-8395-5efa4f2e8a4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.350410 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350351 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjbvz\" (UniqueName: \"kubernetes.io/projected/390c549d-4a2a-4d76-8395-5efa4f2e8a4e-kube-api-access-vjbvz\") pod \"aws-ebs-csi-driver-node-wkbvw\" (UID: \"390c549d-4a2a-4d76-8395-5efa4f2e8a4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.350410 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350375 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1a680fa9-f376-482e-af97-a722bc5b37c6-cni-binary-copy\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.350410 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350401 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2a3eeab2-52f2-4ba5-a534-bef4430448f8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.350661 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350428 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-host\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.350661 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350455 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9flm\" (UniqueName: \"kubernetes.io/projected/b57612e9-f335-4d71-bdba-f06f0735eee1-kube-api-access-s9flm\") pod \"network-metrics-daemon-42tgv\" (UID: \"b57612e9-f335-4d71-bdba-f06f0735eee1\") " pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:21.350661 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350479 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/390c549d-4a2a-4d76-8395-5efa4f2e8a4e-device-dir\") pod \"aws-ebs-csi-driver-node-wkbvw\" (UID: \"390c549d-4a2a-4d76-8395-5efa4f2e8a4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.350661 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350518 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-host-kubelet\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.350661 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350543 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-host-run-netns\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.350661 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350576 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-etc-sysctl-d\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.350661 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350600 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-etc-systemd\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.350661 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350626 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a3eeab2-52f2-4ba5-a534-bef4430448f8-os-release\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.351062 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350665 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-etc-modprobe-d\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.351062 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350690 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ec869d90-6e5f-4329-9d3c-62938cb140e5-tmp-dir\") pod \"node-resolver-pbmrc\" (UID: \"ec869d90-6e5f-4329-9d3c-62938cb140e5\") " pod="openshift-dns/node-resolver-pbmrc" Apr 17 16:31:21.351062 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350714 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbm28\" (UniqueName: \"kubernetes.io/projected/ec869d90-6e5f-4329-9d3c-62938cb140e5-kube-api-access-gbm28\") pod \"node-resolver-pbmrc\" (UID: \"ec869d90-6e5f-4329-9d3c-62938cb140e5\") " pod="openshift-dns/node-resolver-pbmrc" Apr 17 16:31:21.351062 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350753 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-host-run-netns\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.351062 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350765 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/390c549d-4a2a-4d76-8395-5efa4f2e8a4e-registration-dir\") pod \"aws-ebs-csi-driver-node-wkbvw\" (UID: \"390c549d-4a2a-4d76-8395-5efa4f2e8a4e\") 
" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.351062 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350803 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/390c549d-4a2a-4d76-8395-5efa4f2e8a4e-device-dir\") pod \"aws-ebs-csi-driver-node-wkbvw\" (UID: \"390c549d-4a2a-4d76-8395-5efa4f2e8a4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.351062 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350810 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b67a3101-cd16-466d-bb65-1fcd6158a8f4-konnectivity-ca\") pod \"konnectivity-agent-xztmj\" (UID: \"b67a3101-cd16-466d-bb65-1fcd6158a8f4\") " pod="kube-system/konnectivity-agent-xztmj" Apr 17 16:31:21.351062 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350827 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/390c549d-4a2a-4d76-8395-5efa4f2e8a4e-registration-dir\") pod \"aws-ebs-csi-driver-node-wkbvw\" (UID: \"390c549d-4a2a-4d76-8395-5efa4f2e8a4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.351062 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350858 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-var-lib-openvswitch\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.351062 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350890 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-run-openvswitch\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.351062 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350947 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-var-lib-openvswitch\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.351062 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350952 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/677ce4c5-2494-4409-bb8c-263a71ca26d1-ovnkube-script-lib\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.351062 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.350949 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/390c549d-4a2a-4d76-8395-5efa4f2e8a4e-sys-fs\") pod \"aws-ebs-csi-driver-node-wkbvw\" (UID: \"390c549d-4a2a-4d76-8395-5efa4f2e8a4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.351062 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351010 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/2a3eeab2-52f2-4ba5-a534-bef4430448f8-os-release\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.351062 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351034 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-run-openvswitch\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.351704 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351135 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/390c549d-4a2a-4d76-8395-5efa4f2e8a4e-socket-dir\") pod \"aws-ebs-csi-driver-node-wkbvw\" (UID: \"390c549d-4a2a-4d76-8395-5efa4f2e8a4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.351704 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351217 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-host-kubelet\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.351704 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351218 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b67a3101-cd16-466d-bb65-1fcd6158a8f4-konnectivity-ca\") pod \"konnectivity-agent-xztmj\" (UID: \"b67a3101-cd16-466d-bb65-1fcd6158a8f4\") " pod="kube-system/konnectivity-agent-xztmj" Apr 17 16:31:21.351704 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351272 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-var-lib-kubelet\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.351704 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351312 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/318876c9-9879-4dc1-a7cb-9664f3196aa9-host-slash\") pod \"iptables-alerter-tm6mx\" (UID: \"318876c9-9879-4dc1-a7cb-9664f3196aa9\") " pod="openshift-network-operator/iptables-alerter-tm6mx" Apr 17 16:31:21.351704 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351343 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/390c549d-4a2a-4d76-8395-5efa4f2e8a4e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wkbvw\" (UID: \"390c549d-4a2a-4d76-8395-5efa4f2e8a4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.351704 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351363 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-multus-cni-dir\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.351704 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351378 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-host-run-multus-certs\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.351704 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351376 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/318876c9-9879-4dc1-a7cb-9664f3196aa9-host-slash\") pod \"iptables-alerter-tm6mx\" (UID: \"318876c9-9879-4dc1-a7cb-9664f3196aa9\") " pod="openshift-network-operator/iptables-alerter-tm6mx" Apr 17 16:31:21.351704 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351395 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a3eeab2-52f2-4ba5-a534-bef4430448f8-system-cni-dir\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.351704 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351411 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a3eeab2-52f2-4ba5-a534-bef4430448f8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.351704 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351428 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/390c549d-4a2a-4d76-8395-5efa4f2e8a4e-etc-selinux\") pod \"aws-ebs-csi-driver-node-wkbvw\" (UID: \"390c549d-4a2a-4d76-8395-5efa4f2e8a4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.351704 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351428 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/390c549d-4a2a-4d76-8395-5efa4f2e8a4e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wkbvw\" (UID: \"390c549d-4a2a-4d76-8395-5efa4f2e8a4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.351704 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351442 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-system-cni-dir\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.351704 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351447 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a3eeab2-52f2-4ba5-a534-bef4430448f8-system-cni-dir\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.351704 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351409 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1a680fa9-f376-482e-af97-a722bc5b37c6-cni-binary-copy\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " 
pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.351704 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351462 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-host-var-lib-cni-bin\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.352498 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351477 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-host-run-multus-certs\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.352498 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351447 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-multus-cni-dir\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.352498 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351509 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/390c549d-4a2a-4d76-8395-5efa4f2e8a4e-etc-selinux\") pod \"aws-ebs-csi-driver-node-wkbvw\" (UID: \"390c549d-4a2a-4d76-8395-5efa4f2e8a4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.352498 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351503 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e613ab88-5ce8-4dda-a30c-57006804fdb0-host\") pod \"node-ca-6jdwh\" (UID: \"e613ab88-5ce8-4dda-a30c-57006804fdb0\") " pod="openshift-image-registry/node-ca-6jdwh" Apr 17 16:31:21.352498 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351538 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a3eeab2-52f2-4ba5-a534-bef4430448f8-cnibin\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.352498 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351540 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e613ab88-5ce8-4dda-a30c-57006804fdb0-host\") pod \"node-ca-6jdwh\" (UID: \"e613ab88-5ce8-4dda-a30c-57006804fdb0\") " pod="openshift-image-registry/node-ca-6jdwh" Apr 17 16:31:21.352498 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351565 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a3eeab2-52f2-4ba5-a534-bef4430448f8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.352498 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351569 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-th2xn\" (UniqueName: \"kubernetes.io/projected/2a3eeab2-52f2-4ba5-a534-bef4430448f8-kube-api-access-th2xn\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: 
\"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.352498 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351588 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-system-cni-dir\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.352498 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351610 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-node-log\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.352498 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351578 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a3eeab2-52f2-4ba5-a534-bef4430448f8-cnibin\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.352498 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351625 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-host-var-lib-cni-bin\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.352498 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351636 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-etc-kubernetes\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.352498 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351661 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/677ce4c5-2494-4409-bb8c-263a71ca26d1-env-overrides\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.352498 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351664 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-node-log\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.352498 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351662 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-etc-kubernetes\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.352498 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351698 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-etc-sysconfig\") pod \"tuned-4nbnb\" (UID: 
\"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.352498 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351741 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-etc-kubernetes\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.353046 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351768 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwftm\" (UniqueName: \"kubernetes.io/projected/e4399b5a-cfeb-4136-9844-e59d92f13af1-kube-api-access-lwftm\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.353046 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351806 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-host-cni-bin\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.353046 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351831 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-977m4\" (UniqueName: \"kubernetes.io/projected/677ce4c5-2494-4409-bb8c-263a71ca26d1-kube-api-access-977m4\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.353046 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351873 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-host-cni-bin\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.353046 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351906 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-host-var-lib-kubelet\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.353046 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351931 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-multus-conf-dir\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.353046 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.351958 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1a680fa9-f376-482e-af97-a722bc5b37c6-multus-daemon-config\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.353046 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352000 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-host-var-lib-kubelet\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.353046 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352038 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b67a3101-cd16-466d-bb65-1fcd6158a8f4-agent-certs\") pod \"konnectivity-agent-xztmj\" (UID: \"b67a3101-cd16-466d-bb65-1fcd6158a8f4\") " pod="kube-system/konnectivity-agent-xztmj" Apr 17 16:31:21.353046 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352072 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-multus-conf-dir\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.353046 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352039 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/677ce4c5-2494-4409-bb8c-263a71ca26d1-env-overrides\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.353046 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352078 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-os-release\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.353046 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352095 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2a3eeab2-52f2-4ba5-a534-bef4430448f8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.353046 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352125 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-multus-socket-dir-parent\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.353046 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352146 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-os-release\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.353046 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352161 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/318876c9-9879-4dc1-a7cb-9664f3196aa9-iptables-alerter-script\") pod \"iptables-alerter-tm6mx\" (UID: \"318876c9-9879-4dc1-a7cb-9664f3196aa9\") " pod="openshift-network-operator/iptables-alerter-tm6mx" Apr 17 16:31:21.353046 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352173 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-multus-socket-dir-parent\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.353046 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352184 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-systemd-units\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.353529 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352202 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-hostroot\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.353529 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352218 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-run-systemd\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.353529 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352244 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-host-run-ovn-kubernetes\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.353529 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352551 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-host-cni-netd\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.353529 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352594 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-sys\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.353529 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352625 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e4399b5a-cfeb-4136-9844-e59d92f13af1-etc-tuned\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.353529 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352662 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x24dj\" (UniqueName: \"kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj\") pod \"network-check-target-gwj7d\" (UID: \"cad60853-6c2f-411f-935c-f5890843bbf1\") " pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:21.353529 ip-10-0-129-144 kubenswrapper[2574]: I0417 
16:31:21.352706 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a3eeab2-52f2-4ba5-a534-bef4430448f8-cni-binary-copy\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.353529 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352750 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-run\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.353529 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352781 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-lib-modules\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.353529 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352812 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-run-ovn\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.353529 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352845 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs\") pod \"network-metrics-daemon-42tgv\" (UID: \"b57612e9-f335-4d71-bdba-f06f0735eee1\") " pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:21.353529 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352875 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-host-var-lib-cni-multus\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.353529 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352901 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlhmd\" (UniqueName: \"kubernetes.io/projected/1a680fa9-f376-482e-af97-a722bc5b37c6-kube-api-access-rlhmd\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.353529 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352928 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-host-slash\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.353529 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352958 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ec869d90-6e5f-4329-9d3c-62938cb140e5-hosts-file\") pod \"node-resolver-pbmrc\" (UID: \"ec869d90-6e5f-4329-9d3c-62938cb140e5\") " 
pod="openshift-dns/node-resolver-pbmrc" Apr 17 16:31:21.353529 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.352989 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfwvs\" (UniqueName: \"kubernetes.io/projected/318876c9-9879-4dc1-a7cb-9664f3196aa9-kube-api-access-zfwvs\") pod \"iptables-alerter-tm6mx\" (UID: \"318876c9-9879-4dc1-a7cb-9664f3196aa9\") " pod="openshift-network-operator/iptables-alerter-tm6mx" Apr 17 16:31:21.354158 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.353020 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-etc-openvswitch\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.354158 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.353045 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-host-run-netns\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.354158 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.353076 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-cnibin\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.354158 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.353108 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/677ce4c5-2494-4409-bb8c-263a71ca26d1-ovnkube-config\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.354158 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.353138 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/677ce4c5-2494-4409-bb8c-263a71ca26d1-ovn-node-metrics-cert\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.354158 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.353405 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-host-run-ovn-kubernetes\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.354158 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.353435 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-systemd-units\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.354158 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.353624 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1a680fa9-f376-482e-af97-a722bc5b37c6-multus-daemon-config\") pod \"multus-t7lqr\" (UID: 
\"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.354158 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.353690 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-host-cni-netd\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.354158 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.353837 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-run-systemd\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.354158 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.353873 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-hostroot\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.354158 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.353937 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 16:31:21.355335 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.354395 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-etc-openvswitch\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.355335 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:21.354625 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:21.355335 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.354648 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-host-run-netns\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.355335 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.354721 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-cnibin\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.355335 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:21.354759 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs podName:b57612e9-f335-4d71-bdba-f06f0735eee1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:21.854684536 +0000 UTC m=+3.078012296 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs") pod "network-metrics-daemon-42tgv" (UID: "b57612e9-f335-4d71-bdba-f06f0735eee1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:21.355335 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.354807 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-host-var-lib-cni-multus\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.355335 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.354851 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-run-ovn\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.355335 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.354938 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/677ce4c5-2494-4409-bb8c-263a71ca26d1-ovnkube-config\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.355335 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.354948 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a3eeab2-52f2-4ba5-a534-bef4430448f8-cni-binary-copy\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.355335 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.355006 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-host-slash\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.355335 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.355139 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-etc-sysctl-conf\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.355335 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.355181 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4399b5a-cfeb-4136-9844-e59d92f13af1-tmp\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.355335 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.355221 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-host-run-k8s-cni-cncf-io\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.355335 ip-10-0-129-144 
kubenswrapper[2574]: I0417 16:31:21.355256 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e613ab88-5ce8-4dda-a30c-57006804fdb0-serviceca\") pod \"node-ca-6jdwh\" (UID: \"e613ab88-5ce8-4dda-a30c-57006804fdb0\") " pod="openshift-image-registry/node-ca-6jdwh" Apr 17 16:31:21.355335 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.355292 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjnhd\" (UniqueName: \"kubernetes.io/projected/e613ab88-5ce8-4dda-a30c-57006804fdb0-kube-api-access-xjnhd\") pod \"node-ca-6jdwh\" (UID: \"e613ab88-5ce8-4dda-a30c-57006804fdb0\") " pod="openshift-image-registry/node-ca-6jdwh" Apr 17 16:31:21.355335 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.355310 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1a680fa9-f376-482e-af97-a722bc5b37c6-host-run-k8s-cni-cncf-io\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.355335 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.355314 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/318876c9-9879-4dc1-a7cb-9664f3196aa9-iptables-alerter-script\") pod \"iptables-alerter-tm6mx\" (UID: \"318876c9-9879-4dc1-a7cb-9664f3196aa9\") " pod="openshift-network-operator/iptables-alerter-tm6mx" Apr 17 16:31:21.356154 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.355329 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2a3eeab2-52f2-4ba5-a534-bef4430448f8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.356154 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.355356 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-log-socket\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.356154 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.355389 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.356154 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.355485 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.356154 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.355582 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/677ce4c5-2494-4409-bb8c-263a71ca26d1-log-socket\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.356154 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.355621 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e613ab88-5ce8-4dda-a30c-57006804fdb0-serviceca\") pod \"node-ca-6jdwh\" (UID: \"e613ab88-5ce8-4dda-a30c-57006804fdb0\") " pod="openshift-image-registry/node-ca-6jdwh" Apr 17 16:31:21.356154 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.355939 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2a3eeab2-52f2-4ba5-a534-bef4430448f8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.359681 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.359659 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/677ce4c5-2494-4409-bb8c-263a71ca26d1-ovn-node-metrics-cert\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.360169 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.360150 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b67a3101-cd16-466d-bb65-1fcd6158a8f4-agent-certs\") pod \"konnectivity-agent-xztmj\" (UID: \"b67a3101-cd16-466d-bb65-1fcd6158a8f4\") " pod="kube-system/konnectivity-agent-xztmj" Apr 17 16:31:21.374244 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:21.374223 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:21.374330 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:21.374246 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:21.374330 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:21.374260 2574 projected.go:194] Error preparing data for projected volume kube-api-access-x24dj for pod openshift-network-diagnostics/network-check-target-gwj7d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:21.374330 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:21.374331 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj podName:cad60853-6c2f-411f-935c-f5890843bbf1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:21.874314651 +0000 UTC m=+3.097642427 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-x24dj" (UniqueName: "kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj") pod "network-check-target-gwj7d" (UID: "cad60853-6c2f-411f-935c-f5890843bbf1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:21.376884 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.376831 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-th2xn\" (UniqueName: \"kubernetes.io/projected/2a3eeab2-52f2-4ba5-a534-bef4430448f8-kube-api-access-th2xn\") pod \"multus-additional-cni-plugins-6m4wt\" (UID: \"2a3eeab2-52f2-4ba5-a534-bef4430448f8\") " pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.378559 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.378497 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-977m4\" (UniqueName: \"kubernetes.io/projected/677ce4c5-2494-4409-bb8c-263a71ca26d1-kube-api-access-977m4\") pod \"ovnkube-node-bv26r\" (UID: \"677ce4c5-2494-4409-bb8c-263a71ca26d1\") " pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.379696 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.379677 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlhmd\" (UniqueName: \"kubernetes.io/projected/1a680fa9-f376-482e-af97-a722bc5b37c6-kube-api-access-rlhmd\") pod \"multus-t7lqr\" (UID: \"1a680fa9-f376-482e-af97-a722bc5b37c6\") " pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.381888 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.381865 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjbvz\" (UniqueName: \"kubernetes.io/projected/390c549d-4a2a-4d76-8395-5efa4f2e8a4e-kube-api-access-vjbvz\") pod \"aws-ebs-csi-driver-node-wkbvw\" (UID: \"390c549d-4a2a-4d76-8395-5efa4f2e8a4e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.384130 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.384105 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfwvs\" (UniqueName: \"kubernetes.io/projected/318876c9-9879-4dc1-a7cb-9664f3196aa9-kube-api-access-zfwvs\") pod \"iptables-alerter-tm6mx\" (UID: \"318876c9-9879-4dc1-a7cb-9664f3196aa9\") " pod="openshift-network-operator/iptables-alerter-tm6mx" Apr 17 16:31:21.384617 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.384593 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9flm\" (UniqueName: \"kubernetes.io/projected/b57612e9-f335-4d71-bdba-f06f0735eee1-kube-api-access-s9flm\") pod \"network-metrics-daemon-42tgv\" (UID: \"b57612e9-f335-4d71-bdba-f06f0735eee1\") " pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:21.386130 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.386105 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjnhd\" (UniqueName: \"kubernetes.io/projected/e613ab88-5ce8-4dda-a30c-57006804fdb0-kube-api-access-xjnhd\") pod \"node-ca-6jdwh\" (UID: \"e613ab88-5ce8-4dda-a30c-57006804fdb0\") " pod="openshift-image-registry/node-ca-6jdwh" Apr 17 16:31:21.456141 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456114 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-sys\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456141 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456145 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e4399b5a-cfeb-4136-9844-e59d92f13af1-etc-tuned\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456323 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456176 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-run\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456323 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456201 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-lib-modules\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456323 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456238 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ec869d90-6e5f-4329-9d3c-62938cb140e5-hosts-file\") pod \"node-resolver-pbmrc\" (UID: \"ec869d90-6e5f-4329-9d3c-62938cb140e5\") " pod="openshift-dns/node-resolver-pbmrc" Apr 17 16:31:21.456323 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456238 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-sys\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456323 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456261 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-etc-sysctl-conf\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456323 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456281 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-run\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456323 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456314 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ec869d90-6e5f-4329-9d3c-62938cb140e5-hosts-file\") pod \"node-resolver-pbmrc\" (UID: \"ec869d90-6e5f-4329-9d3c-62938cb140e5\") " pod="openshift-dns/node-resolver-pbmrc" Apr 17 16:31:21.456323 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456320 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4399b5a-cfeb-4136-9844-e59d92f13af1-tmp\") pod 
\"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456642 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456359 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-host\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456642 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456380 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-lib-modules\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456642 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456390 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-etc-sysctl-d\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456642 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456414 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-etc-systemd\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456642 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456419 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-etc-sysctl-conf\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456642 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456439 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-etc-modprobe-d\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456642 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456444 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-host\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456642 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456465 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ec869d90-6e5f-4329-9d3c-62938cb140e5-tmp-dir\") pod \"node-resolver-pbmrc\" (UID: \"ec869d90-6e5f-4329-9d3c-62938cb140e5\") " pod="openshift-dns/node-resolver-pbmrc" Apr 17 16:31:21.456642 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456490 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbm28\" (UniqueName: \"kubernetes.io/projected/ec869d90-6e5f-4329-9d3c-62938cb140e5-kube-api-access-gbm28\") pod 
\"node-resolver-pbmrc\" (UID: \"ec869d90-6e5f-4329-9d3c-62938cb140e5\") " pod="openshift-dns/node-resolver-pbmrc" Apr 17 16:31:21.456642 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456520 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-var-lib-kubelet\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456642 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456528 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-etc-sysctl-d\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456642 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456536 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-etc-modprobe-d\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456642 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456539 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-etc-systemd\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456642 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456574 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-etc-sysconfig\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456642 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456581 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-var-lib-kubelet\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456642 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456600 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-etc-kubernetes\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456642 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456628 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-etc-sysconfig\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.456642 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456626 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwftm\" (UniqueName: 
\"kubernetes.io/projected/e4399b5a-cfeb-4136-9844-e59d92f13af1-kube-api-access-lwftm\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.457411 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456696 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e4399b5a-cfeb-4136-9844-e59d92f13af1-etc-kubernetes\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.457411 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.456823 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ec869d90-6e5f-4329-9d3c-62938cb140e5-tmp-dir\") pod \"node-resolver-pbmrc\" (UID: \"ec869d90-6e5f-4329-9d3c-62938cb140e5\") " pod="openshift-dns/node-resolver-pbmrc" Apr 17 16:31:21.458713 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.458690 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e4399b5a-cfeb-4136-9844-e59d92f13af1-etc-tuned\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.459308 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.459286 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4399b5a-cfeb-4136-9844-e59d92f13af1-tmp\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.477257 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.477208 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwftm\" (UniqueName: \"kubernetes.io/projected/e4399b5a-cfeb-4136-9844-e59d92f13af1-kube-api-access-lwftm\") pod \"tuned-4nbnb\" (UID: \"e4399b5a-cfeb-4136-9844-e59d92f13af1\") " pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.480538 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.480519 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbm28\" (UniqueName: \"kubernetes.io/projected/ec869d90-6e5f-4329-9d3c-62938cb140e5-kube-api-access-gbm28\") pod \"node-resolver-pbmrc\" (UID: \"ec869d90-6e5f-4329-9d3c-62938cb140e5\") " pod="openshift-dns/node-resolver-pbmrc" Apr 17 16:31:21.485551 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.485530 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:31:21.542641 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.542601 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-t7lqr" Apr 17 16:31:21.553173 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.553153 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tm6mx" Apr 17 16:31:21.561937 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.561910 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:21.568512 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.568483 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" Apr 17 16:31:21.574983 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.574963 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6jdwh" Apr 17 16:31:21.582519 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.582492 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6m4wt" Apr 17 16:31:21.590054 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.590035 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xztmj" Apr 17 16:31:21.597618 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.597596 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" Apr 17 16:31:21.605131 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.605113 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-pbmrc" Apr 17 16:31:21.859694 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.859624 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs\") pod \"network-metrics-daemon-42tgv\" (UID: \"b57612e9-f335-4d71-bdba-f06f0735eee1\") " pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:21.859860 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:21.859758 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:21.859860 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:21.859822 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs podName:b57612e9-f335-4d71-bdba-f06f0735eee1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:22.859805782 +0000 UTC m=+4.083133538 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs") pod "network-metrics-daemon-42tgv" (UID: "b57612e9-f335-4d71-bdba-f06f0735eee1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:21.960301 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:21.960272 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x24dj\" (UniqueName: \"kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj\") pod \"network-check-target-gwj7d\" (UID: \"cad60853-6c2f-411f-935c-f5890843bbf1\") " pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:21.960469 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:21.960427 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:21.960469 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:21.960454 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:21.960469 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:21.960467 2574 projected.go:194] Error preparing data for projected volume kube-api-access-x24dj for pod openshift-network-diagnostics/network-check-target-gwj7d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:21.960585 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:21.960531 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj podName:cad60853-6c2f-411f-935c-f5890843bbf1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:22.960511178 +0000 UTC m=+4.183838937 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-x24dj" (UniqueName: "kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj") pod "network-check-target-gwj7d" (UID: "cad60853-6c2f-411f-935c-f5890843bbf1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:22.220050 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:22.220022 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod677ce4c5_2494_4409_bb8c_263a71ca26d1.slice/crio-052b1d8bfbd09d01bb37c902ec9f1201e649284e151c5dcb22a7bfc25276ca2b WatchSource:0}: Error finding container 052b1d8bfbd09d01bb37c902ec9f1201e649284e151c5dcb22a7bfc25276ca2b: Status 404 returned error can't find the container with id 052b1d8bfbd09d01bb37c902ec9f1201e649284e151c5dcb22a7bfc25276ca2b Apr 17 16:31:22.221458 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:22.221432 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode613ab88_5ce8_4dda_a30c_57006804fdb0.slice/crio-61ec07a1a23be9a0e2a96f08b4500078d51dc38c4f5382645fb18ff5692df082 WatchSource:0}: Error finding container 61ec07a1a23be9a0e2a96f08b4500078d51dc38c4f5382645fb18ff5692df082: Status 404 returned error can't find the container with id 61ec07a1a23be9a0e2a96f08b4500078d51dc38c4f5382645fb18ff5692df082 Apr 17 16:31:22.222499 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:22.222476 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec869d90_6e5f_4329_9d3c_62938cb140e5.slice/crio-0b75da2f36e3011a0b5b9cbf219a928d610de57cf846c6b163f4d8be53bc26e9 WatchSource:0}: Error finding container 0b75da2f36e3011a0b5b9cbf219a928d610de57cf846c6b163f4d8be53bc26e9: Status 404 returned error can't find the container with id 0b75da2f36e3011a0b5b9cbf219a928d610de57cf846c6b163f4d8be53bc26e9 Apr 17 16:31:22.223724 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:22.223687 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a680fa9_f376_482e_af97_a722bc5b37c6.slice/crio-d27386a9625ff19633b1ee05a2153b296d3a2d0221beb74d647fa772b52c319e WatchSource:0}: Error finding container d27386a9625ff19633b1ee05a2153b296d3a2d0221beb74d647fa772b52c319e: Status 404 returned error can't find the container with id d27386a9625ff19633b1ee05a2153b296d3a2d0221beb74d647fa772b52c319e Apr 17 16:31:22.226852 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:22.226822 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4399b5a_cfeb_4136_9844_e59d92f13af1.slice/crio-02d4d65960709b510f30ef7bbbc5011f4ccdb60a38c10051893bc97a124cb3ec WatchSource:0}: Error finding container 02d4d65960709b510f30ef7bbbc5011f4ccdb60a38c10051893bc97a124cb3ec: Status 404 returned error can't find the container with id 02d4d65960709b510f30ef7bbbc5011f4ccdb60a38c10051893bc97a124cb3ec Apr 17 16:31:22.228085 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:22.228062 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod390c549d_4a2a_4d76_8395_5efa4f2e8a4e.slice/crio-ebaac893ed0c140d6b1d8491396b756ce88c64e8e79c598ebc5552b94d1fff28 WatchSource:0}: Error finding 
container ebaac893ed0c140d6b1d8491396b756ce88c64e8e79c598ebc5552b94d1fff28: Status 404 returned error can't find the container with id ebaac893ed0c140d6b1d8491396b756ce88c64e8e79c598ebc5552b94d1fff28 Apr 17 16:31:22.229489 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:22.228865 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb67a3101_cd16_466d_bb65_1fcd6158a8f4.slice/crio-30070f375bc9ecffdb3ca957837d5a410a037fccbce699d524df0fc2638a3f78 WatchSource:0}: Error finding container 30070f375bc9ecffdb3ca957837d5a410a037fccbce699d524df0fc2638a3f78: Status 404 returned error can't find the container with id 30070f375bc9ecffdb3ca957837d5a410a037fccbce699d524df0fc2638a3f78 Apr 17 16:31:22.229811 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:22.229789 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a3eeab2_52f2_4ba5_a534_bef4430448f8.slice/crio-a5e418efcbd93d98dec92511722495021d069977628e4e4bbbcd5cf220ffa343 WatchSource:0}: Error finding container a5e418efcbd93d98dec92511722495021d069977628e4e4bbbcd5cf220ffa343: Status 404 returned error can't find the container with id a5e418efcbd93d98dec92511722495021d069977628e4e4bbbcd5cf220ffa343 Apr 17 16:31:22.231174 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:31:22.231153 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod318876c9_9879_4dc1_a7cb_9664f3196aa9.slice/crio-8c788fbb724e7dc940e5e2d0282eff143537e3fecb456fbe1cc795c7fe9e6e1e WatchSource:0}: Error finding container 8c788fbb724e7dc940e5e2d0282eff143537e3fecb456fbe1cc795c7fe9e6e1e: Status 404 returned error can't find the container with id 8c788fbb724e7dc940e5e2d0282eff143537e3fecb456fbe1cc795c7fe9e6e1e Apr 17 16:31:22.292109 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:22.292078 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:26:20 +0000 UTC" deadline="2027-10-16 23:55:04.477009159 +0000 UTC" Apr 17 16:31:22.292109 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:22.292107 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13135h23m42.184904994s" Apr 17 16:31:22.377679 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:22.377648 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:22.377819 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:22.377786 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-42tgv" podUID="b57612e9-f335-4d71-bdba-f06f0735eee1" Apr 17 16:31:22.385042 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:22.385013 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-144.ec2.internal" event={"ID":"c1defed80364f63d2790d12bb9e06eb2","Type":"ContainerStarted","Data":"7408e272cf6e6bc64ffd0137b5b50cb25a0888f850aa6b777623339cb2295895"} Apr 17 16:31:22.386090 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:22.386065 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" event={"ID":"390c549d-4a2a-4d76-8395-5efa4f2e8a4e","Type":"ContainerStarted","Data":"ebaac893ed0c140d6b1d8491396b756ce88c64e8e79c598ebc5552b94d1fff28"} Apr 17 16:31:22.386967 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:22.386948 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t7lqr" event={"ID":"1a680fa9-f376-482e-af97-a722bc5b37c6","Type":"ContainerStarted","Data":"d27386a9625ff19633b1ee05a2153b296d3a2d0221beb74d647fa772b52c319e"} Apr 17 16:31:22.387874 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:22.387852 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pbmrc" event={"ID":"ec869d90-6e5f-4329-9d3c-62938cb140e5","Type":"ContainerStarted","Data":"0b75da2f36e3011a0b5b9cbf219a928d610de57cf846c6b163f4d8be53bc26e9"} Apr 17 16:31:22.388884 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:22.388860 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" event={"ID":"677ce4c5-2494-4409-bb8c-263a71ca26d1","Type":"ContainerStarted","Data":"052b1d8bfbd09d01bb37c902ec9f1201e649284e151c5dcb22a7bfc25276ca2b"} Apr 17 16:31:22.389774 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:22.389753 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tm6mx" event={"ID":"318876c9-9879-4dc1-a7cb-9664f3196aa9","Type":"ContainerStarted","Data":"8c788fbb724e7dc940e5e2d0282eff143537e3fecb456fbe1cc795c7fe9e6e1e"} Apr 17 16:31:22.390546 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:22.390526 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6m4wt" event={"ID":"2a3eeab2-52f2-4ba5-a534-bef4430448f8","Type":"ContainerStarted","Data":"a5e418efcbd93d98dec92511722495021d069977628e4e4bbbcd5cf220ffa343"} Apr 17 16:31:22.391439 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:22.391419 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xztmj" event={"ID":"b67a3101-cd16-466d-bb65-1fcd6158a8f4","Type":"ContainerStarted","Data":"30070f375bc9ecffdb3ca957837d5a410a037fccbce699d524df0fc2638a3f78"} Apr 17 16:31:22.392323 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:22.392305 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" event={"ID":"e4399b5a-cfeb-4136-9844-e59d92f13af1","Type":"ContainerStarted","Data":"02d4d65960709b510f30ef7bbbc5011f4ccdb60a38c10051893bc97a124cb3ec"} Apr 17 16:31:22.393233 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:22.393211 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6jdwh" event={"ID":"e613ab88-5ce8-4dda-a30c-57006804fdb0","Type":"ContainerStarted","Data":"61ec07a1a23be9a0e2a96f08b4500078d51dc38c4f5382645fb18ff5692df082"} Apr 17 16:31:22.397268 ip-10-0-129-144 
kubenswrapper[2574]: I0417 16:31:22.397234 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-144.ec2.internal" podStartSLOduration=2.397224192 podStartE2EDuration="2.397224192s" podCreationTimestamp="2026-04-17 16:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:22.396810505 +0000 UTC m=+3.620138294" watchObservedRunningTime="2026-04-17 16:31:22.397224192 +0000 UTC m=+3.620551969" Apr 17 16:31:22.873418 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:22.871354 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs\") pod \"network-metrics-daemon-42tgv\" (UID: \"b57612e9-f335-4d71-bdba-f06f0735eee1\") " pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:22.873418 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:22.871512 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:22.873418 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:22.871570 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs podName:b57612e9-f335-4d71-bdba-f06f0735eee1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:24.871551806 +0000 UTC m=+6.094879566 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs") pod "network-metrics-daemon-42tgv" (UID: "b57612e9-f335-4d71-bdba-f06f0735eee1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:22.972185 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:22.972146 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x24dj\" (UniqueName: \"kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj\") pod \"network-check-target-gwj7d\" (UID: \"cad60853-6c2f-411f-935c-f5890843bbf1\") " pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:22.972354 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:22.972330 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:22.972354 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:22.972350 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:22.972463 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:22.972363 2574 projected.go:194] Error preparing data for projected volume kube-api-access-x24dj for pod openshift-network-diagnostics/network-check-target-gwj7d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:22.972463 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:22.972423 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj podName:cad60853-6c2f-411f-935c-f5890843bbf1 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:31:24.972402533 +0000 UTC m=+6.195730302 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-x24dj" (UniqueName: "kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj") pod "network-check-target-gwj7d" (UID: "cad60853-6c2f-411f-935c-f5890843bbf1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:23.378365 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:23.377885 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:23.378365 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:23.378017 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gwj7d" podUID="cad60853-6c2f-411f-935c-f5890843bbf1" Apr 17 16:31:23.417141 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:23.417066 2574 generic.go:358] "Generic (PLEG): container finished" podID="d9666662d7a9a022fd7fae8af93e7302" containerID="151f556d999bffb1ce91eafe44a78493764fefe3a59a60fa28ba2ad74bfe24f7" exitCode=0 Apr 17 16:31:23.417582 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:23.417552 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-144.ec2.internal" event={"ID":"d9666662d7a9a022fd7fae8af93e7302","Type":"ContainerDied","Data":"151f556d999bffb1ce91eafe44a78493764fefe3a59a60fa28ba2ad74bfe24f7"} Apr 17 16:31:24.377825 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:24.377322 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:24.377825 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:24.377462 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-42tgv" podUID="b57612e9-f335-4d71-bdba-f06f0735eee1" Apr 17 16:31:24.443488 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:24.443455 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-144.ec2.internal" event={"ID":"d9666662d7a9a022fd7fae8af93e7302","Type":"ContainerStarted","Data":"97a49318938d4d61cd702eccc742b21881748434e7f8e159e1fed11887c5373a"} Apr 17 16:31:24.890785 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:24.890743 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs\") pod \"network-metrics-daemon-42tgv\" (UID: \"b57612e9-f335-4d71-bdba-f06f0735eee1\") " pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:24.890997 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:24.890979 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:24.891074 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:24.891052 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs podName:b57612e9-f335-4d71-bdba-f06f0735eee1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:28.891033671 +0000 UTC m=+10.114361441 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs") pod "network-metrics-daemon-42tgv" (UID: "b57612e9-f335-4d71-bdba-f06f0735eee1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:24.991213 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:24.991167 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x24dj\" (UniqueName: \"kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj\") pod \"network-check-target-gwj7d\" (UID: \"cad60853-6c2f-411f-935c-f5890843bbf1\") " pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:24.991418 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:24.991398 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:24.991492 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:24.991427 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:24.991492 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:24.991441 2574 projected.go:194] Error preparing data for projected volume kube-api-access-x24dj for pod openshift-network-diagnostics/network-check-target-gwj7d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:24.991597 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:24.991501 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj podName:cad60853-6c2f-411f-935c-f5890843bbf1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:28.991482369 +0000 UTC m=+10.214810128 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-x24dj" (UniqueName: "kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj") pod "network-check-target-gwj7d" (UID: "cad60853-6c2f-411f-935c-f5890843bbf1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:25.379077 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:25.378959 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:25.379420 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:25.379098 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gwj7d" podUID="cad60853-6c2f-411f-935c-f5890843bbf1" Apr 17 16:31:26.377451 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:26.377415 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:26.377911 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:26.377561 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42tgv" podUID="b57612e9-f335-4d71-bdba-f06f0735eee1" Apr 17 16:31:27.377937 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:27.377907 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:27.378380 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:27.378040 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gwj7d" podUID="cad60853-6c2f-411f-935c-f5890843bbf1" Apr 17 16:31:28.377749 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:28.377710 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:28.377910 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:28.377857 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-42tgv" podUID="b57612e9-f335-4d71-bdba-f06f0735eee1" Apr 17 16:31:28.922393 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:28.922306 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs\") pod \"network-metrics-daemon-42tgv\" (UID: \"b57612e9-f335-4d71-bdba-f06f0735eee1\") " pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:28.922885 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:28.922431 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:28.922885 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:28.922513 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs podName:b57612e9-f335-4d71-bdba-f06f0735eee1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:36.922492076 +0000 UTC m=+18.145819835 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs") pod "network-metrics-daemon-42tgv" (UID: "b57612e9-f335-4d71-bdba-f06f0735eee1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:29.023630 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:29.023592 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x24dj\" (UniqueName: \"kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj\") pod \"network-check-target-gwj7d\" (UID: \"cad60853-6c2f-411f-935c-f5890843bbf1\") " pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:29.023830 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:29.023811 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:29.023906 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:29.023837 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:29.023906 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:29.023850 2574 projected.go:194] Error preparing data for projected volume kube-api-access-x24dj for pod openshift-network-diagnostics/network-check-target-gwj7d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:29.024003 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:29.023911 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj podName:cad60853-6c2f-411f-935c-f5890843bbf1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:37.023892704 +0000 UTC m=+18.247220471 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-x24dj" (UniqueName: "kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj") pod "network-check-target-gwj7d" (UID: "cad60853-6c2f-411f-935c-f5890843bbf1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:29.339386 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:29.339021 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-144.ec2.internal" podStartSLOduration=9.339001683 podStartE2EDuration="9.339001683s" podCreationTimestamp="2026-04-17 16:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:31:24.45796987 +0000 UTC m=+5.681297646" watchObservedRunningTime="2026-04-17 16:31:29.339001683 +0000 UTC m=+10.562329464" Apr 17 16:31:29.339559 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:29.339452 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-5phzj"] Apr 17 16:31:29.342308 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:29.341955 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:29.342308 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:29.342025 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5phzj" podUID="a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c" Apr 17 16:31:29.377790 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:29.377761 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:29.377917 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:29.377882 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gwj7d" podUID="cad60853-6c2f-411f-935c-f5890843bbf1" Apr 17 16:31:29.428315 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:29.428282 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-original-pull-secret\") pod \"global-pull-secret-syncer-5phzj\" (UID: \"a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c\") " pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:29.428471 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:29.428384 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-kubelet-config\") pod \"global-pull-secret-syncer-5phzj\" (UID: \"a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c\") " pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:29.428536 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:29.428478 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-dbus\") pod \"global-pull-secret-syncer-5phzj\" (UID: \"a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c\") " pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:29.529337 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:29.528991 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-original-pull-secret\") pod \"global-pull-secret-syncer-5phzj\" (UID: \"a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c\") " pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:29.529337 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:29.529045 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-kubelet-config\") pod \"global-pull-secret-syncer-5phzj\" (UID: \"a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c\") " pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:29.529337 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:29.529096 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-dbus\") pod \"global-pull-secret-syncer-5phzj\" (UID: \"a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c\") " pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:29.529337 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:29.529188 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:29.529337 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:29.529217 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-kubelet-config\") pod \"global-pull-secret-syncer-5phzj\" (UID: \"a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c\") " pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:29.529337 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:29.529251 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-original-pull-secret podName:a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c nodeName:}" failed. 
No retries permitted until 2026-04-17 16:31:30.029234553 +0000 UTC m=+11.252562317 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-original-pull-secret") pod "global-pull-secret-syncer-5phzj" (UID: "a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:29.529337 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:29.529301 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-dbus\") pod \"global-pull-secret-syncer-5phzj\" (UID: \"a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c\") " pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:30.033840 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:30.033753 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-original-pull-secret\") pod \"global-pull-secret-syncer-5phzj\" (UID: \"a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c\") " pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:30.034252 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:30.033954 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:30.034252 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:30.034017 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-original-pull-secret podName:a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c nodeName:}" failed. No retries permitted until 2026-04-17 16:31:31.033999009 +0000 UTC m=+12.257326788 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-original-pull-secret") pod "global-pull-secret-syncer-5phzj" (UID: "a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:30.377980 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:30.377404 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:30.377980 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:30.377556 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-42tgv" podUID="b57612e9-f335-4d71-bdba-f06f0735eee1" Apr 17 16:31:31.040435 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:31.040397 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-original-pull-secret\") pod \"global-pull-secret-syncer-5phzj\" (UID: \"a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c\") " pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:31.040908 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:31.040540 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:31.040908 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:31.040601 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-original-pull-secret podName:a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c nodeName:}" failed. No retries permitted until 2026-04-17 16:31:33.040581053 +0000 UTC m=+14.263908826 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-original-pull-secret") pod "global-pull-secret-syncer-5phzj" (UID: "a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:31.378133 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:31.378040 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:31.378133 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:31.378082 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:31.378332 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:31.378178 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gwj7d" podUID="cad60853-6c2f-411f-935c-f5890843bbf1" Apr 17 16:31:31.378332 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:31.378276 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5phzj" podUID="a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c" Apr 17 16:31:32.377924 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:32.377887 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:32.378398 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:32.378028 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-42tgv" podUID="b57612e9-f335-4d71-bdba-f06f0735eee1" Apr 17 16:31:33.054431 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:33.054391 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-original-pull-secret\") pod \"global-pull-secret-syncer-5phzj\" (UID: \"a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c\") " pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:33.054597 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:33.054522 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:33.054597 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:33.054582 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-original-pull-secret podName:a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c nodeName:}" failed. No retries permitted until 2026-04-17 16:31:37.054563665 +0000 UTC m=+18.277891425 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-original-pull-secret") pod "global-pull-secret-syncer-5phzj" (UID: "a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:33.378084 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:33.377996 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:33.378482 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:33.377996 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:33.378482 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:33.378122 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gwj7d" podUID="cad60853-6c2f-411f-935c-f5890843bbf1" Apr 17 16:31:33.378482 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:33.378227 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5phzj" podUID="a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c" Apr 17 16:31:34.377089 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:34.377063 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:34.377250 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:34.377160 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-42tgv" podUID="b57612e9-f335-4d71-bdba-f06f0735eee1" Apr 17 16:31:35.377045 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:35.377007 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:35.377509 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:35.377137 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gwj7d" podUID="cad60853-6c2f-411f-935c-f5890843bbf1" Apr 17 16:31:35.377509 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:35.377213 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:35.377509 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:35.377299 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5phzj" podUID="a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c" Apr 17 16:31:36.377807 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:36.377774 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:36.378246 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:36.377901 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42tgv" podUID="b57612e9-f335-4d71-bdba-f06f0735eee1" Apr 17 16:31:36.984119 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:36.984078 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs\") pod \"network-metrics-daemon-42tgv\" (UID: \"b57612e9-f335-4d71-bdba-f06f0735eee1\") " pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:36.984378 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:36.984223 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:36.984378 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:36.984294 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs podName:b57612e9-f335-4d71-bdba-f06f0735eee1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:52.984274286 +0000 UTC m=+34.207602042 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs") pod "network-metrics-daemon-42tgv" (UID: "b57612e9-f335-4d71-bdba-f06f0735eee1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:37.085427 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:37.085389 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-original-pull-secret\") pod \"global-pull-secret-syncer-5phzj\" (UID: \"a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c\") " pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:37.085601 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:37.085457 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x24dj\" (UniqueName: \"kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj\") pod \"network-check-target-gwj7d\" (UID: \"cad60853-6c2f-411f-935c-f5890843bbf1\") " pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:37.085601 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:37.085569 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:37.085601 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:37.085585 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:37.085601 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:37.085594 2574 projected.go:194] Error preparing data for projected volume kube-api-access-x24dj for pod openshift-network-diagnostics/network-check-target-gwj7d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:37.085800 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:37.085637 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj podName:cad60853-6c2f-411f-935c-f5890843bbf1 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:53.085625168 +0000 UTC m=+34.308952923 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-x24dj" (UniqueName: "kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj") pod "network-check-target-gwj7d" (UID: "cad60853-6c2f-411f-935c-f5890843bbf1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:37.085800 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:37.085568 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:37.085800 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:37.085719 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-original-pull-secret podName:a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c nodeName:}" failed. No retries permitted until 2026-04-17 16:31:45.085697853 +0000 UTC m=+26.309025615 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-original-pull-secret") pod "global-pull-secret-syncer-5phzj" (UID: "a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:37.377634 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:37.377555 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:37.377634 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:37.377607 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:37.377842 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:37.377689 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gwj7d" podUID="cad60853-6c2f-411f-935c-f5890843bbf1" Apr 17 16:31:37.378161 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:37.377842 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5phzj" podUID="a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c" Apr 17 16:31:38.377034 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:38.377003 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:38.377211 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:38.377115 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42tgv" podUID="b57612e9-f335-4d71-bdba-f06f0735eee1" Apr 17 16:31:39.379432 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:39.379289 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:39.379802 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:39.379375 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:39.379802 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:39.379551 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-gwj7d" podUID="cad60853-6c2f-411f-935c-f5890843bbf1" Apr 17 16:31:39.379802 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:39.379617 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5phzj" podUID="a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c" Apr 17 16:31:40.377078 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.377051 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:40.377196 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:40.377175 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42tgv" podUID="b57612e9-f335-4d71-bdba-f06f0735eee1" Apr 17 16:31:40.482946 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.482922 2574 generic.go:358] "Generic (PLEG): container finished" podID="2a3eeab2-52f2-4ba5-a534-bef4430448f8" containerID="48e217fe021ecf04b675c2e38815c9935e28706bf14752bfa478fd1b67b95de9" exitCode=0 Apr 17 16:31:40.483682 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.483006 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6m4wt" event={"ID":"2a3eeab2-52f2-4ba5-a534-bef4430448f8","Type":"ContainerDied","Data":"48e217fe021ecf04b675c2e38815c9935e28706bf14752bfa478fd1b67b95de9"} Apr 17 16:31:40.484538 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.484514 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xztmj" event={"ID":"b67a3101-cd16-466d-bb65-1fcd6158a8f4","Type":"ContainerStarted","Data":"3e05024ab5145cec8fe6b349ecd106639cce4d41f6ad2741f393861c9d10d6cc"} Apr 17 16:31:40.485719 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.485605 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" event={"ID":"e4399b5a-cfeb-4136-9844-e59d92f13af1","Type":"ContainerStarted","Data":"1e38dffe6d5c01fd08eb89732269cd9879e5a8e41dd051ecaa2812b6788d8826"} Apr 17 16:31:40.486803 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.486785 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6jdwh" event={"ID":"e613ab88-5ce8-4dda-a30c-57006804fdb0","Type":"ContainerStarted","Data":"bde5dc0b2dec0f2c02200fdfa8028af10b524d92475d0ece4607443468ac4283"} Apr 17 16:31:40.488133 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.488114 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" event={"ID":"390c549d-4a2a-4d76-8395-5efa4f2e8a4e","Type":"ContainerStarted","Data":"728581156cfd40c0b8b70580749b17b1bfd78d290275ac92f8ca7a71365d8117"} Apr 17 16:31:40.489620 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.489596 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t7lqr" 
event={"ID":"1a680fa9-f376-482e-af97-a722bc5b37c6","Type":"ContainerStarted","Data":"b8baaabb3960cda30b7f03ea7645b513d8d08979a23afa7908075bf83a577794"} Apr 17 16:31:40.490809 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.490788 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-pbmrc" event={"ID":"ec869d90-6e5f-4329-9d3c-62938cb140e5","Type":"ContainerStarted","Data":"6d76a972635e23364a63df6e4d365a35f56e0d5430457e01315a9aec053cc067"} Apr 17 16:31:40.493039 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.493022 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 16:31:40.493287 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.493269 2574 generic.go:358] "Generic (PLEG): container finished" podID="677ce4c5-2494-4409-bb8c-263a71ca26d1" containerID="42d0ecfacef4c35d553826c2a62d5da9039a6ce4cfef9e99d6f6efa955880f6e" exitCode=1 Apr 17 16:31:40.493367 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.493294 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" event={"ID":"677ce4c5-2494-4409-bb8c-263a71ca26d1","Type":"ContainerStarted","Data":"467382d61edf63186c698f9f013604bd597cbc116b12327df38a48504688644e"} Apr 17 16:31:40.493367 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.493307 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" event={"ID":"677ce4c5-2494-4409-bb8c-263a71ca26d1","Type":"ContainerStarted","Data":"365c45e0634c63372cf970067a30892113be3874c513308950929234bfd9e595"} Apr 17 16:31:40.493367 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.493317 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" event={"ID":"677ce4c5-2494-4409-bb8c-263a71ca26d1","Type":"ContainerStarted","Data":"88821f57a06751cac1d3952b070c99643c692d0b6b830ae914500bede1487404"} Apr 17 16:31:40.493367 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.493328 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" event={"ID":"677ce4c5-2494-4409-bb8c-263a71ca26d1","Type":"ContainerStarted","Data":"6943f97203a239db7d59965809d1afba4824df92b4f5e8aa48b0d1c2e349cc31"} Apr 17 16:31:40.493367 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.493338 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" event={"ID":"677ce4c5-2494-4409-bb8c-263a71ca26d1","Type":"ContainerDied","Data":"42d0ecfacef4c35d553826c2a62d5da9039a6ce4cfef9e99d6f6efa955880f6e"} Apr 17 16:31:40.493367 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.493346 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" event={"ID":"677ce4c5-2494-4409-bb8c-263a71ca26d1","Type":"ContainerStarted","Data":"4570e105c6c6e3af4ec59bea32535ec5d156c6c81ca79cfa4ef35515b6ae2995"} Apr 17 16:31:40.524941 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.524891 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-6jdwh" podStartSLOduration=9.125878573 podStartE2EDuration="21.524876898s" podCreationTimestamp="2026-04-17 16:31:19 +0000 UTC" firstStartedPulling="2026-04-17 16:31:22.223261933 +0000 UTC m=+3.446589689" lastFinishedPulling="2026-04-17 16:31:34.622260243 +0000 UTC m=+15.845588014" observedRunningTime="2026-04-17 
16:31:40.524427067 +0000 UTC m=+21.747754865" watchObservedRunningTime="2026-04-17 16:31:40.524876898 +0000 UTC m=+21.748204677" Apr 17 16:31:40.540355 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.540307 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-pbmrc" podStartSLOduration=4.810560667 podStartE2EDuration="21.540280652s" podCreationTimestamp="2026-04-17 16:31:19 +0000 UTC" firstStartedPulling="2026-04-17 16:31:22.225185217 +0000 UTC m=+3.448512985" lastFinishedPulling="2026-04-17 16:31:38.954905188 +0000 UTC m=+20.178232970" observedRunningTime="2026-04-17 16:31:40.540255995 +0000 UTC m=+21.763583763" watchObservedRunningTime="2026-04-17 16:31:40.540280652 +0000 UTC m=+21.763608428" Apr 17 16:31:40.558809 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.558764 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-xztmj" podStartSLOduration=4.835013415 podStartE2EDuration="21.558748913s" podCreationTimestamp="2026-04-17 16:31:19 +0000 UTC" firstStartedPulling="2026-04-17 16:31:22.231185586 +0000 UTC m=+3.454513357" lastFinishedPulling="2026-04-17 16:31:38.954921082 +0000 UTC m=+20.178248855" observedRunningTime="2026-04-17 16:31:40.556867227 +0000 UTC m=+21.780194999" watchObservedRunningTime="2026-04-17 16:31:40.558748913 +0000 UTC m=+21.782076682" Apr 17 16:31:40.592619 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.592574 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-4nbnb" podStartSLOduration=4.480185197 podStartE2EDuration="21.592559403s" podCreationTimestamp="2026-04-17 16:31:19 +0000 UTC" firstStartedPulling="2026-04-17 16:31:22.228548972 +0000 UTC m=+3.451876728" lastFinishedPulling="2026-04-17 16:31:39.340923179 +0000 UTC m=+20.564250934" observedRunningTime="2026-04-17 16:31:40.573992145 +0000 UTC m=+21.797319922" watchObservedRunningTime="2026-04-17 16:31:40.592559403 +0000 UTC m=+21.815887182" Apr 17 16:31:40.592753 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.592704 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-t7lqr" podStartSLOduration=4.469953561 podStartE2EDuration="21.592698703s" podCreationTimestamp="2026-04-17 16:31:19 +0000 UTC" firstStartedPulling="2026-04-17 16:31:22.225614908 +0000 UTC m=+3.448942663" lastFinishedPulling="2026-04-17 16:31:39.348360047 +0000 UTC m=+20.571687805" observedRunningTime="2026-04-17 16:31:40.592465722 +0000 UTC m=+21.815793499" watchObservedRunningTime="2026-04-17 16:31:40.592698703 +0000 UTC m=+21.816026480" Apr 17 16:31:40.627796 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:40.627662 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 16:31:41.306103 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:41.305958 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T16:31:40.627793284Z","UUID":"9c29f16d-e747-4e50-9c6b-18558e4607cc","Handler":null,"Name":"","Endpoint":""} Apr 17 16:31:41.308350 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:41.308330 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 
16:31:41.308476 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:41.308356 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 16:31:41.377831 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:41.377802 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:41.377967 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:41.377919 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:41.377967 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:41.377920 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gwj7d" podUID="cad60853-6c2f-411f-935c-f5890843bbf1" Apr 17 16:31:41.378086 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:41.377994 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5phzj" podUID="a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c" Apr 17 16:31:41.497439 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:41.497408 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" event={"ID":"390c549d-4a2a-4d76-8395-5efa4f2e8a4e","Type":"ContainerStarted","Data":"a7628479574a8d9a6fc4e835bb9ae566d2a84710b0f2c70f76d8b53bdd388092"} Apr 17 16:31:41.498818 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:41.498793 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tm6mx" event={"ID":"318876c9-9879-4dc1-a7cb-9664f3196aa9","Type":"ContainerStarted","Data":"9fea1a66a8971d2c04295ac0269bb930647a26c91a3a6593246e0be1ce668e58"} Apr 17 16:31:41.514172 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:41.514124 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-tm6mx" podStartSLOduration=5.43912302 podStartE2EDuration="22.514107339s" podCreationTimestamp="2026-04-17 16:31:19 +0000 UTC" firstStartedPulling="2026-04-17 16:31:22.233691124 +0000 UTC m=+3.457018882" lastFinishedPulling="2026-04-17 16:31:39.308675441 +0000 UTC m=+20.532003201" observedRunningTime="2026-04-17 16:31:41.513592023 +0000 UTC m=+22.736919803" watchObservedRunningTime="2026-04-17 16:31:41.514107339 +0000 UTC m=+22.737435118" Apr 17 16:31:42.378028 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:42.377983 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:42.378192 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:42.378133 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42tgv" podUID="b57612e9-f335-4d71-bdba-f06f0735eee1" Apr 17 16:31:42.503568 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:42.503524 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" event={"ID":"390c549d-4a2a-4d76-8395-5efa4f2e8a4e","Type":"ContainerStarted","Data":"ec8042aea678dd9ea316611b94631642770b00dde54de758d287101d93c9b0f5"} Apr 17 16:31:42.522134 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:42.522094 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wkbvw" podStartSLOduration=4.0907142 podStartE2EDuration="23.522079808s" podCreationTimestamp="2026-04-17 16:31:19 +0000 UTC" firstStartedPulling="2026-04-17 16:31:22.229853025 +0000 UTC m=+3.453180799" lastFinishedPulling="2026-04-17 16:31:41.661218649 +0000 UTC m=+22.884546407" observedRunningTime="2026-04-17 16:31:42.521651369 +0000 UTC m=+23.744979147" watchObservedRunningTime="2026-04-17 16:31:42.522079808 +0000 UTC m=+23.745407583" Apr 17 16:31:42.970819 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:42.970781 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xztmj" Apr 17 16:31:42.971698 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:42.971676 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xztmj" Apr 17 16:31:43.377049 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:43.377010 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:43.377231 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:43.377017 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:43.377231 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:43.377140 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gwj7d" podUID="cad60853-6c2f-411f-935c-f5890843bbf1" Apr 17 16:31:43.377352 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:43.377224 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5phzj" podUID="a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c" Apr 17 16:31:43.508233 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:43.508204 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 16:31:43.508708 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:43.508634 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" event={"ID":"677ce4c5-2494-4409-bb8c-263a71ca26d1","Type":"ContainerStarted","Data":"6264a8072aee3ef32e51120951b4b6e6272ba4362f7d06b4c6a62ef18af16851"} Apr 17 16:31:43.508862 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:43.508836 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-xztmj" Apr 17 16:31:43.509445 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:43.509426 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xztmj" Apr 17 16:31:44.377796 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:44.377766 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:44.377943 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:44.377877 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42tgv" podUID="b57612e9-f335-4d71-bdba-f06f0735eee1" Apr 17 16:31:45.147114 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:45.146930 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-original-pull-secret\") pod \"global-pull-secret-syncer-5phzj\" (UID: \"a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c\") " pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:45.147522 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:45.147065 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:45.147522 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:45.147179 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-original-pull-secret podName:a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c nodeName:}" failed. No retries permitted until 2026-04-17 16:32:01.147164415 +0000 UTC m=+42.370492171 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-original-pull-secret") pod "global-pull-secret-syncer-5phzj" (UID: "a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:31:45.377141 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:45.377109 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:45.377141 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:45.377136 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:45.377344 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:45.377202 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gwj7d" podUID="cad60853-6c2f-411f-935c-f5890843bbf1" Apr 17 16:31:45.377344 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:45.377263 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5phzj" podUID="a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c" Apr 17 16:31:45.514989 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:45.514968 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 16:31:45.515286 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:45.515257 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" event={"ID":"677ce4c5-2494-4409-bb8c-263a71ca26d1","Type":"ContainerStarted","Data":"0802a1044da0f1761938f6582ec69cc6887fb9c3fd61d4ed34caf5619a2dbaab"} Apr 17 16:31:45.515604 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:45.515572 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:45.515604 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:45.515597 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:45.515713 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:45.515610 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:45.515824 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:45.515806 2574 scope.go:117] "RemoveContainer" containerID="42d0ecfacef4c35d553826c2a62d5da9039a6ce4cfef9e99d6f6efa955880f6e" Apr 17 16:31:45.517028 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:45.517006 2574 generic.go:358] "Generic (PLEG): container finished" podID="2a3eeab2-52f2-4ba5-a534-bef4430448f8" containerID="f9aa3f967630118867777ee5f8b834a1573670596935119bba7d1da85bb61dd2" exitCode=0 Apr 17 16:31:45.517137 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:45.517095 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6m4wt" event={"ID":"2a3eeab2-52f2-4ba5-a534-bef4430448f8","Type":"ContainerDied","Data":"f9aa3f967630118867777ee5f8b834a1573670596935119bba7d1da85bb61dd2"} Apr 17 16:31:45.530803 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:45.530773 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:45.530984 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:45.530971 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:31:46.377658 ip-10-0-129-144 kubenswrapper[2574]: 
I0417 16:31:46.377631 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:46.378012 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:46.377790 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42tgv" podUID="b57612e9-f335-4d71-bdba-f06f0735eee1" Apr 17 16:31:46.522505 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:46.522359 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 16:31:46.522843 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:46.522813 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" event={"ID":"677ce4c5-2494-4409-bb8c-263a71ca26d1","Type":"ContainerStarted","Data":"decb06ef3d38c9d85ee938da81da993aa0052f04783cec95453998ec3056cb99"} Apr 17 16:31:46.524517 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:46.524497 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6m4wt" event={"ID":"2a3eeab2-52f2-4ba5-a534-bef4430448f8","Type":"ContainerStarted","Data":"7cc440be9d4b10bd6bfa74af28e4566302d783061a72646d191cca805b3b34da"} Apr 17 16:31:46.563492 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:46.563451 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" podStartSLOduration=10.385837232 podStartE2EDuration="27.563440043s" podCreationTimestamp="2026-04-17 16:31:19 +0000 UTC" firstStartedPulling="2026-04-17 16:31:22.222502738 +0000 UTC m=+3.445830494" lastFinishedPulling="2026-04-17 16:31:39.400105549 +0000 UTC m=+20.623433305" observedRunningTime="2026-04-17 16:31:46.563273372 +0000 UTC m=+27.786601150" watchObservedRunningTime="2026-04-17 16:31:46.563440043 +0000 UTC m=+27.786767821" Apr 17 16:31:46.629749 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:46.629671 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5phzj"] Apr 17 16:31:46.629867 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:46.629792 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:46.629933 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:46.629866 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5phzj" podUID="a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c" Apr 17 16:31:46.632672 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:46.632643 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-gwj7d"] Apr 17 16:31:46.632790 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:46.632774 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:46.632890 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:46.632868 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gwj7d" podUID="cad60853-6c2f-411f-935c-f5890843bbf1" Apr 17 16:31:46.633211 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:46.633193 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-42tgv"] Apr 17 16:31:46.633274 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:46.633264 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:46.633374 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:46.633355 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42tgv" podUID="b57612e9-f335-4d71-bdba-f06f0735eee1" Apr 17 16:31:47.527969 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:47.527939 2574 generic.go:358] "Generic (PLEG): container finished" podID="2a3eeab2-52f2-4ba5-a534-bef4430448f8" containerID="7cc440be9d4b10bd6bfa74af28e4566302d783061a72646d191cca805b3b34da" exitCode=0 Apr 17 16:31:47.527969 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:47.527962 2574 generic.go:358] "Generic (PLEG): container finished" podID="2a3eeab2-52f2-4ba5-a534-bef4430448f8" containerID="155f0bb1b9ff75b82da017e2f1aeb35273cb7c1990ffb5158f5695057dedda48" exitCode=0 Apr 17 16:31:47.528551 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:47.528028 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6m4wt" event={"ID":"2a3eeab2-52f2-4ba5-a534-bef4430448f8","Type":"ContainerDied","Data":"7cc440be9d4b10bd6bfa74af28e4566302d783061a72646d191cca805b3b34da"} Apr 17 16:31:47.528551 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:47.528061 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6m4wt" event={"ID":"2a3eeab2-52f2-4ba5-a534-bef4430448f8","Type":"ContainerDied","Data":"155f0bb1b9ff75b82da017e2f1aeb35273cb7c1990ffb5158f5695057dedda48"} Apr 17 16:31:48.377183 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:48.377147 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:48.377393 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:48.377148 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:48.377393 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:48.377279 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-5phzj" podUID="a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c" Apr 17 16:31:48.377393 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:48.377148 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:48.377393 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:48.377353 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gwj7d" podUID="cad60853-6c2f-411f-935c-f5890843bbf1" Apr 17 16:31:48.377598 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:48.377441 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42tgv" podUID="b57612e9-f335-4d71-bdba-f06f0735eee1" Apr 17 16:31:50.377453 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:50.377420 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:50.378171 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:50.377420 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:50.378171 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:50.377537 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gwj7d" podUID="cad60853-6c2f-411f-935c-f5890843bbf1" Apr 17 16:31:50.378171 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:50.377421 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:50.378171 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:50.377640 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42tgv" podUID="b57612e9-f335-4d71-bdba-f06f0735eee1" Apr 17 16:31:50.378171 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:50.377710 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5phzj" podUID="a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c" Apr 17 16:31:52.378011 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.377562 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:52.378011 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.377834 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:52.378011 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:52.377949 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-42tgv" podUID="b57612e9-f335-4d71-bdba-f06f0735eee1" Apr 17 16:31:52.378011 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.378031 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:52.378776 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:52.378112 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-gwj7d" podUID="cad60853-6c2f-411f-935c-f5890843bbf1" Apr 17 16:31:52.378776 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:52.378179 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-5phzj" podUID="a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c" Apr 17 16:31:52.646952 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.646840 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-144.ec2.internal" event="NodeReady" Apr 17 16:31:52.647127 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.646998 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 16:31:52.691284 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.691252 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vpvzg"] Apr 17 16:31:52.696246 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.696222 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jvmkj"] Apr 17 16:31:52.696408 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.696390 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vpvzg" Apr 17 16:31:52.699692 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.699673 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 16:31:52.699861 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.699819 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dbwrt\"" Apr 17 16:31:52.699980 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.699942 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jvmkj" Apr 17 16:31:52.700180 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.699946 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 16:31:52.702350 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.702217 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 16:31:52.702350 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.702290 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 16:31:52.702504 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.702434 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 16:31:52.702504 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.702468 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ttskh\"" Apr 17 16:31:52.704608 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.704572 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vpvzg"] Apr 17 16:31:52.717930 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.717907 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jvmkj"] Apr 17 16:31:52.809228 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.809197 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06db4982-9078-40f4-a267-e270f44de092-config-volume\") pod \"dns-default-vpvzg\" (UID: \"06db4982-9078-40f4-a267-e270f44de092\") " pod="openshift-dns/dns-default-vpvzg" Apr 17 16:31:52.809228 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.809237 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dcz5\" (UniqueName: \"kubernetes.io/projected/06db4982-9078-40f4-a267-e270f44de092-kube-api-access-9dcz5\") pod \"dns-default-vpvzg\" (UID: \"06db4982-9078-40f4-a267-e270f44de092\") " pod="openshift-dns/dns-default-vpvzg" Apr 17 16:31:52.809460 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.809261 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert\") pod \"ingress-canary-jvmkj\" (UID: \"34b1d336-8626-43f1-8ced-2764a72b207b\") " pod="openshift-ingress-canary/ingress-canary-jvmkj" Apr 17 16:31:52.809460 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.809298 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbsxw\" (UniqueName: \"kubernetes.io/projected/34b1d336-8626-43f1-8ced-2764a72b207b-kube-api-access-rbsxw\") pod \"ingress-canary-jvmkj\" (UID: \"34b1d336-8626-43f1-8ced-2764a72b207b\") " pod="openshift-ingress-canary/ingress-canary-jvmkj" Apr 17 16:31:52.809460 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.809376 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/06db4982-9078-40f4-a267-e270f44de092-tmp-dir\") pod \"dns-default-vpvzg\" (UID: \"06db4982-9078-40f4-a267-e270f44de092\") " 
pod="openshift-dns/dns-default-vpvzg" Apr 17 16:31:52.809460 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.809410 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls\") pod \"dns-default-vpvzg\" (UID: \"06db4982-9078-40f4-a267-e270f44de092\") " pod="openshift-dns/dns-default-vpvzg" Apr 17 16:31:52.910576 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.910506 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/06db4982-9078-40f4-a267-e270f44de092-tmp-dir\") pod \"dns-default-vpvzg\" (UID: \"06db4982-9078-40f4-a267-e270f44de092\") " pod="openshift-dns/dns-default-vpvzg" Apr 17 16:31:52.910576 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.910553 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls\") pod \"dns-default-vpvzg\" (UID: \"06db4982-9078-40f4-a267-e270f44de092\") " pod="openshift-dns/dns-default-vpvzg" Apr 17 16:31:52.910758 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.910592 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06db4982-9078-40f4-a267-e270f44de092-config-volume\") pod \"dns-default-vpvzg\" (UID: \"06db4982-9078-40f4-a267-e270f44de092\") " pod="openshift-dns/dns-default-vpvzg" Apr 17 16:31:52.910758 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.910609 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dcz5\" (UniqueName: \"kubernetes.io/projected/06db4982-9078-40f4-a267-e270f44de092-kube-api-access-9dcz5\") pod \"dns-default-vpvzg\" (UID: \"06db4982-9078-40f4-a267-e270f44de092\") " pod="openshift-dns/dns-default-vpvzg" Apr 17 16:31:52.910758 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.910624 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert\") pod \"ingress-canary-jvmkj\" (UID: \"34b1d336-8626-43f1-8ced-2764a72b207b\") " pod="openshift-ingress-canary/ingress-canary-jvmkj" Apr 17 16:31:52.910758 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.910643 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbsxw\" (UniqueName: \"kubernetes.io/projected/34b1d336-8626-43f1-8ced-2764a72b207b-kube-api-access-rbsxw\") pod \"ingress-canary-jvmkj\" (UID: \"34b1d336-8626-43f1-8ced-2764a72b207b\") " pod="openshift-ingress-canary/ingress-canary-jvmkj" Apr 17 16:31:52.910961 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:52.910768 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:52.910961 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:52.910798 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:52.910961 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:52.910844 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls podName:06db4982-9078-40f4-a267-e270f44de092 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:31:53.410821498 +0000 UTC m=+34.634149272 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls") pod "dns-default-vpvzg" (UID: "06db4982-9078-40f4-a267-e270f44de092") : secret "dns-default-metrics-tls" not found Apr 17 16:31:52.910961 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:52.910864 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert podName:34b1d336-8626-43f1-8ced-2764a72b207b nodeName:}" failed. No retries permitted until 2026-04-17 16:31:53.410854567 +0000 UTC m=+34.634182322 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert") pod "ingress-canary-jvmkj" (UID: "34b1d336-8626-43f1-8ced-2764a72b207b") : secret "canary-serving-cert" not found Apr 17 16:31:52.910961 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.910842 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/06db4982-9078-40f4-a267-e270f44de092-tmp-dir\") pod \"dns-default-vpvzg\" (UID: \"06db4982-9078-40f4-a267-e270f44de092\") " pod="openshift-dns/dns-default-vpvzg" Apr 17 16:31:52.911242 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.911220 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06db4982-9078-40f4-a267-e270f44de092-config-volume\") pod \"dns-default-vpvzg\" (UID: \"06db4982-9078-40f4-a267-e270f44de092\") " pod="openshift-dns/dns-default-vpvzg" Apr 17 16:31:52.923794 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.923768 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dcz5\" (UniqueName: \"kubernetes.io/projected/06db4982-9078-40f4-a267-e270f44de092-kube-api-access-9dcz5\") pod \"dns-default-vpvzg\" (UID: \"06db4982-9078-40f4-a267-e270f44de092\") " pod="openshift-dns/dns-default-vpvzg" Apr 17 16:31:52.923922 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:52.923855 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbsxw\" (UniqueName: \"kubernetes.io/projected/34b1d336-8626-43f1-8ced-2764a72b207b-kube-api-access-rbsxw\") pod \"ingress-canary-jvmkj\" (UID: \"34b1d336-8626-43f1-8ced-2764a72b207b\") " pod="openshift-ingress-canary/ingress-canary-jvmkj" Apr 17 16:31:53.011958 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:53.011922 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs\") pod \"network-metrics-daemon-42tgv\" (UID: \"b57612e9-f335-4d71-bdba-f06f0735eee1\") " pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:53.012105 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:53.012065 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:53.012143 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:53.012133 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs podName:b57612e9-f335-4d71-bdba-f06f0735eee1 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:32:25.01211606 +0000 UTC m=+66.235443816 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs") pod "network-metrics-daemon-42tgv" (UID: "b57612e9-f335-4d71-bdba-f06f0735eee1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:31:53.112704 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:53.112671 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x24dj\" (UniqueName: \"kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj\") pod \"network-check-target-gwj7d\" (UID: \"cad60853-6c2f-411f-935c-f5890843bbf1\") " pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:53.112850 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:53.112839 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:31:53.112887 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:53.112855 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:31:53.112887 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:53.112863 2574 projected.go:194] Error preparing data for projected volume kube-api-access-x24dj for pod openshift-network-diagnostics/network-check-target-gwj7d: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:53.112948 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:53.112911 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj podName:cad60853-6c2f-411f-935c-f5890843bbf1 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:25.112898567 +0000 UTC m=+66.336226322 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-x24dj" (UniqueName: "kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj") pod "network-check-target-gwj7d" (UID: "cad60853-6c2f-411f-935c-f5890843bbf1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:31:53.418431 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:53.418334 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls\") pod \"dns-default-vpvzg\" (UID: \"06db4982-9078-40f4-a267-e270f44de092\") " pod="openshift-dns/dns-default-vpvzg" Apr 17 16:31:53.418431 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:53.418396 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert\") pod \"ingress-canary-jvmkj\" (UID: \"34b1d336-8626-43f1-8ced-2764a72b207b\") " pod="openshift-ingress-canary/ingress-canary-jvmkj" Apr 17 16:31:53.418961 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:53.418482 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:53.418961 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:53.418547 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls podName:06db4982-9078-40f4-a267-e270f44de092 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:54.418532521 +0000 UTC m=+35.641860277 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls") pod "dns-default-vpvzg" (UID: "06db4982-9078-40f4-a267-e270f44de092") : secret "dns-default-metrics-tls" not found Apr 17 16:31:53.418961 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:53.418489 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:53.418961 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:53.418639 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert podName:34b1d336-8626-43f1-8ced-2764a72b207b nodeName:}" failed. No retries permitted until 2026-04-17 16:31:54.418622613 +0000 UTC m=+35.641950371 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert") pod "ingress-canary-jvmkj" (UID: "34b1d336-8626-43f1-8ced-2764a72b207b") : secret "canary-serving-cert" not found Apr 17 16:31:53.544584 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:53.544554 2574 generic.go:358] "Generic (PLEG): container finished" podID="2a3eeab2-52f2-4ba5-a534-bef4430448f8" containerID="159e64679d846d1dac33c89f3e874b99464c527e86172ca6d5ac09bfb5eee62a" exitCode=0 Apr 17 16:31:53.544769 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:53.544612 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6m4wt" event={"ID":"2a3eeab2-52f2-4ba5-a534-bef4430448f8","Type":"ContainerDied","Data":"159e64679d846d1dac33c89f3e874b99464c527e86172ca6d5ac09bfb5eee62a"} Apr 17 16:31:54.377687 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:54.377642 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:31:54.377905 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:54.377642 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:31:54.377905 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:54.377642 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:31:54.383621 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:54.383599 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 16:31:54.383908 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:54.383778 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xlgkd\"" Apr 17 16:31:54.384943 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:54.384921 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-26v9q\"" Apr 17 16:31:54.385049 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:54.384949 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 16:31:54.385049 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:54.384957 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 16:31:54.385049 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:54.384929 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 16:31:54.426715 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:54.426682 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert\") pod \"ingress-canary-jvmkj\" (UID: \"34b1d336-8626-43f1-8ced-2764a72b207b\") " pod="openshift-ingress-canary/ingress-canary-jvmkj" Apr 17 16:31:54.427094 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:54.426774 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls\") pod \"dns-default-vpvzg\" (UID: \"06db4982-9078-40f4-a267-e270f44de092\") " 
pod="openshift-dns/dns-default-vpvzg" Apr 17 16:31:54.427094 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:54.426854 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:54.427094 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:54.426868 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:54.427094 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:54.426923 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls podName:06db4982-9078-40f4-a267-e270f44de092 nodeName:}" failed. No retries permitted until 2026-04-17 16:31:56.426904036 +0000 UTC m=+37.650231794 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls") pod "dns-default-vpvzg" (UID: "06db4982-9078-40f4-a267-e270f44de092") : secret "dns-default-metrics-tls" not found Apr 17 16:31:54.427094 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:54.426940 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert podName:34b1d336-8626-43f1-8ced-2764a72b207b nodeName:}" failed. No retries permitted until 2026-04-17 16:31:56.426931856 +0000 UTC m=+37.650259612 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert") pod "ingress-canary-jvmkj" (UID: "34b1d336-8626-43f1-8ced-2764a72b207b") : secret "canary-serving-cert" not found Apr 17 16:31:54.548621 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:54.548582 2574 generic.go:358] "Generic (PLEG): container finished" podID="2a3eeab2-52f2-4ba5-a534-bef4430448f8" containerID="2a32df0103b7af6225fa6d0d8710bc6613d55a7f105a29b72a175ee93c264f61" exitCode=0 Apr 17 16:31:54.548787 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:54.548643 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6m4wt" event={"ID":"2a3eeab2-52f2-4ba5-a534-bef4430448f8","Type":"ContainerDied","Data":"2a32df0103b7af6225fa6d0d8710bc6613d55a7f105a29b72a175ee93c264f61"} Apr 17 16:31:55.552644 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:55.552617 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6m4wt" event={"ID":"2a3eeab2-52f2-4ba5-a534-bef4430448f8","Type":"ContainerStarted","Data":"bc7cf806d7da0bd9895f0f91c3eb2bb7c4303d505c421ae93234cce601f8911f"} Apr 17 16:31:55.579569 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:55.578393 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6m4wt" podStartSLOduration=5.67216699 podStartE2EDuration="36.578374779s" podCreationTimestamp="2026-04-17 16:31:19 +0000 UTC" firstStartedPulling="2026-04-17 16:31:22.232186799 +0000 UTC m=+3.455514554" lastFinishedPulling="2026-04-17 16:31:53.138394585 +0000 UTC m=+34.361722343" observedRunningTime="2026-04-17 16:31:55.575608807 +0000 UTC m=+36.798936582" watchObservedRunningTime="2026-04-17 16:31:55.578374779 +0000 UTC m=+36.801702562" Apr 17 16:31:56.443494 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:56.443454 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls\") pod \"dns-default-vpvzg\" (UID: \"06db4982-9078-40f4-a267-e270f44de092\") " pod="openshift-dns/dns-default-vpvzg" Apr 17 16:31:56.443666 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:31:56.443515 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert\") pod \"ingress-canary-jvmkj\" (UID: \"34b1d336-8626-43f1-8ced-2764a72b207b\") " pod="openshift-ingress-canary/ingress-canary-jvmkj" Apr 17 16:31:56.443666 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:56.443604 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:31:56.443666 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:56.443609 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:31:56.443666 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:56.443658 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert podName:34b1d336-8626-43f1-8ced-2764a72b207b nodeName:}" failed. No retries permitted until 2026-04-17 16:32:00.443644643 +0000 UTC m=+41.666972399 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert") pod "ingress-canary-jvmkj" (UID: "34b1d336-8626-43f1-8ced-2764a72b207b") : secret "canary-serving-cert" not found Apr 17 16:31:56.443830 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:31:56.443671 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls podName:06db4982-9078-40f4-a267-e270f44de092 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:00.443665502 +0000 UTC m=+41.666993258 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls") pod "dns-default-vpvzg" (UID: "06db4982-9078-40f4-a267-e270f44de092") : secret "dns-default-metrics-tls" not found Apr 17 16:32:00.471753 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:00.471706 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls\") pod \"dns-default-vpvzg\" (UID: \"06db4982-9078-40f4-a267-e270f44de092\") " pod="openshift-dns/dns-default-vpvzg" Apr 17 16:32:00.472159 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:00.471778 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert\") pod \"ingress-canary-jvmkj\" (UID: \"34b1d336-8626-43f1-8ced-2764a72b207b\") " pod="openshift-ingress-canary/ingress-canary-jvmkj" Apr 17 16:32:00.472159 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:32:00.471872 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:00.472159 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:32:00.471880 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:00.472159 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:32:00.471921 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert podName:34b1d336-8626-43f1-8ced-2764a72b207b nodeName:}" failed. No retries permitted until 2026-04-17 16:32:08.471908201 +0000 UTC m=+49.695235956 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert") pod "ingress-canary-jvmkj" (UID: "34b1d336-8626-43f1-8ced-2764a72b207b") : secret "canary-serving-cert" not found Apr 17 16:32:00.472159 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:32:00.471944 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls podName:06db4982-9078-40f4-a267-e270f44de092 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:08.471927987 +0000 UTC m=+49.695255743 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls") pod "dns-default-vpvzg" (UID: "06db4982-9078-40f4-a267-e270f44de092") : secret "dns-default-metrics-tls" not found Apr 17 16:32:01.175771 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:01.175721 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-original-pull-secret\") pod \"global-pull-secret-syncer-5phzj\" (UID: \"a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c\") " pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:32:01.178877 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:01.178850 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c-original-pull-secret\") pod \"global-pull-secret-syncer-5phzj\" (UID: \"a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c\") " pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:32:01.297043 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:01.297007 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5phzj" Apr 17 16:32:01.427595 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:01.427526 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5phzj"] Apr 17 16:32:01.432792 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:32:01.431970 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda437f6b6_ca4b_4b4f_b5b1_5f126c2fac1c.slice/crio-3630b36eb236c1a81093975f081c24607f6b296867152545997e9e874db5fe5e WatchSource:0}: Error finding container 3630b36eb236c1a81093975f081c24607f6b296867152545997e9e874db5fe5e: Status 404 returned error can't find the container with id 3630b36eb236c1a81093975f081c24607f6b296867152545997e9e874db5fe5e Apr 17 16:32:01.564117 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:01.564079 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5phzj" event={"ID":"a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c","Type":"ContainerStarted","Data":"3630b36eb236c1a81093975f081c24607f6b296867152545997e9e874db5fe5e"} Apr 17 16:32:05.573784 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:05.573741 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5phzj" event={"ID":"a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c","Type":"ContainerStarted","Data":"97ed27a4afdb40fd859552992e2e7a7bfff75561c5280cef208f9b09086dce2a"} Apr 17 16:32:05.588247 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:05.588104 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-5phzj" podStartSLOduration=32.67644562 podStartE2EDuration="36.588084364s" podCreationTimestamp="2026-04-17 16:31:29 +0000 UTC" firstStartedPulling="2026-04-17 16:32:01.436338974 +0000 UTC m=+42.659666736" lastFinishedPulling="2026-04-17 16:32:05.347977721 +0000 UTC m=+46.571305480" observedRunningTime="2026-04-17 16:32:05.587654801 +0000 UTC m=+46.810982579" watchObservedRunningTime="2026-04-17 16:32:05.588084364 +0000 UTC m=+46.811412143" Apr 17 16:32:08.533245 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:08.533208 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls\") pod \"dns-default-vpvzg\" (UID: \"06db4982-9078-40f4-a267-e270f44de092\") " pod="openshift-dns/dns-default-vpvzg" Apr 17 16:32:08.533697 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:08.533262 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert\") pod \"ingress-canary-jvmkj\" (UID: \"34b1d336-8626-43f1-8ced-2764a72b207b\") " pod="openshift-ingress-canary/ingress-canary-jvmkj" Apr 17 16:32:08.533697 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:32:08.533369 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:08.533697 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:32:08.533425 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert podName:34b1d336-8626-43f1-8ced-2764a72b207b nodeName:}" failed. No retries permitted until 2026-04-17 16:32:24.533410865 +0000 UTC m=+65.756738621 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert") pod "ingress-canary-jvmkj" (UID: "34b1d336-8626-43f1-8ced-2764a72b207b") : secret "canary-serving-cert" not found Apr 17 16:32:08.533697 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:32:08.533369 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:08.533697 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:32:08.533522 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls podName:06db4982-9078-40f4-a267-e270f44de092 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:24.533508326 +0000 UTC m=+65.756836095 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls") pod "dns-default-vpvzg" (UID: "06db4982-9078-40f4-a267-e270f44de092") : secret "dns-default-metrics-tls" not found Apr 17 16:32:17.538652 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:17.538622 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bv26r" Apr 17 16:32:24.545053 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:24.545022 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls\") pod \"dns-default-vpvzg\" (UID: \"06db4982-9078-40f4-a267-e270f44de092\") " pod="openshift-dns/dns-default-vpvzg" Apr 17 16:32:24.545558 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:24.545067 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert\") pod \"ingress-canary-jvmkj\" (UID: \"34b1d336-8626-43f1-8ced-2764a72b207b\") " pod="openshift-ingress-canary/ingress-canary-jvmkj" Apr 17 16:32:24.545558 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:32:24.545177 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:24.545558 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:32:24.545253 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls podName:06db4982-9078-40f4-a267-e270f44de092 nodeName:}" failed. No retries permitted until 2026-04-17 16:32:56.545235423 +0000 UTC m=+97.768563178 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls") pod "dns-default-vpvzg" (UID: "06db4982-9078-40f4-a267-e270f44de092") : secret "dns-default-metrics-tls" not found Apr 17 16:32:24.545558 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:32:24.545260 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:24.545558 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:32:24.545320 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert podName:34b1d336-8626-43f1-8ced-2764a72b207b nodeName:}" failed. No retries permitted until 2026-04-17 16:32:56.545302394 +0000 UTC m=+97.768630155 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert") pod "ingress-canary-jvmkj" (UID: "34b1d336-8626-43f1-8ced-2764a72b207b") : secret "canary-serving-cert" not found Apr 17 16:32:25.048210 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:25.048175 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs\") pod \"network-metrics-daemon-42tgv\" (UID: \"b57612e9-f335-4d71-bdba-f06f0735eee1\") " pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:32:25.050849 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:25.050831 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 16:32:25.058933 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:32:25.058919 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 16:32:25.058990 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:32:25.058981 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs podName:b57612e9-f335-4d71-bdba-f06f0735eee1 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:29.058965083 +0000 UTC m=+130.282292839 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs") pod "network-metrics-daemon-42tgv" (UID: "b57612e9-f335-4d71-bdba-f06f0735eee1") : secret "metrics-daemon-secret" not found Apr 17 16:32:25.148853 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:25.148809 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x24dj\" (UniqueName: \"kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj\") pod \"network-check-target-gwj7d\" (UID: \"cad60853-6c2f-411f-935c-f5890843bbf1\") " pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:32:25.151673 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:25.151657 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 16:32:25.161598 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:25.161582 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 16:32:25.172767 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:25.172724 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x24dj\" (UniqueName: \"kubernetes.io/projected/cad60853-6c2f-411f-935c-f5890843bbf1-kube-api-access-x24dj\") pod \"network-check-target-gwj7d\" (UID: \"cad60853-6c2f-411f-935c-f5890843bbf1\") " pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:32:25.295213 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:25.295182 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-26v9q\"" Apr 17 16:32:25.302455 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:25.302400 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:32:25.440255 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:25.440215 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-gwj7d"] Apr 17 16:32:25.443360 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:32:25.443332 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcad60853_6c2f_411f_935c_f5890843bbf1.slice/crio-104aa391cf5ecdb3b0f8fcbb14f62e9ceef087283a51abdbb3f627bc28704260 WatchSource:0}: Error finding container 104aa391cf5ecdb3b0f8fcbb14f62e9ceef087283a51abdbb3f627bc28704260: Status 404 returned error can't find the container with id 104aa391cf5ecdb3b0f8fcbb14f62e9ceef087283a51abdbb3f627bc28704260 Apr 17 16:32:25.611228 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:25.611158 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-gwj7d" event={"ID":"cad60853-6c2f-411f-935c-f5890843bbf1","Type":"ContainerStarted","Data":"104aa391cf5ecdb3b0f8fcbb14f62e9ceef087283a51abdbb3f627bc28704260"} Apr 17 16:32:28.617438 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:28.617411 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-gwj7d" event={"ID":"cad60853-6c2f-411f-935c-f5890843bbf1","Type":"ContainerStarted","Data":"e9600050283aff120d37d88b3cf38f94cb78f2cac1b697f049af41e28f88d16a"} Apr 17 16:32:28.617824 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:28.617548 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:32:28.632226 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:28.632181 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-gwj7d" podStartSLOduration=66.540130088 podStartE2EDuration="1m9.632166346s" podCreationTimestamp="2026-04-17 16:31:19 +0000 UTC" firstStartedPulling="2026-04-17 16:32:25.445271782 +0000 UTC m=+66.668599538" lastFinishedPulling="2026-04-17 16:32:28.53730804 +0000 UTC m=+69.760635796" observedRunningTime="2026-04-17 16:32:28.63216316 +0000 UTC m=+69.855490941" watchObservedRunningTime="2026-04-17 16:32:28.632166346 +0000 UTC m=+69.855494101" Apr 17 16:32:56.560683 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:56.560648 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls\") pod \"dns-default-vpvzg\" (UID: \"06db4982-9078-40f4-a267-e270f44de092\") " pod="openshift-dns/dns-default-vpvzg" Apr 17 16:32:56.561076 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:56.560698 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert\") pod \"ingress-canary-jvmkj\" (UID: \"34b1d336-8626-43f1-8ced-2764a72b207b\") " pod="openshift-ingress-canary/ingress-canary-jvmkj" Apr 17 16:32:56.561076 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:32:56.560809 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:32:56.561076 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:32:56.560809 2574 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:32:56.561076 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:32:56.560868 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert podName:34b1d336-8626-43f1-8ced-2764a72b207b nodeName:}" failed. No retries permitted until 2026-04-17 16:34:00.560854372 +0000 UTC m=+161.784182127 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert") pod "ingress-canary-jvmkj" (UID: "34b1d336-8626-43f1-8ced-2764a72b207b") : secret "canary-serving-cert" not found Apr 17 16:32:56.561076 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:32:56.560882 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls podName:06db4982-9078-40f4-a267-e270f44de092 nodeName:}" failed. No retries permitted until 2026-04-17 16:34:00.560875338 +0000 UTC m=+161.784203094 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls") pod "dns-default-vpvzg" (UID: "06db4982-9078-40f4-a267-e270f44de092") : secret "dns-default-metrics-tls" not found Apr 17 16:32:59.621709 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:32:59.621679 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-gwj7d" Apr 17 16:33:21.862723 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.862688 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q49fp"] Apr 17 16:33:21.864510 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.864494 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q49fp" Apr 17 16:33:21.867010 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.866987 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-6xgsh\"" Apr 17 16:33:21.867110 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.866991 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 16:33:21.867906 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.867889 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:33:21.871851 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.871831 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp58c"] Apr 17 16:33:21.873615 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.873599 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp58c" Apr 17 16:33:21.875371 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.875353 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q49fp"] Apr 17 16:33:21.876625 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.876606 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 16:33:21.877324 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.877307 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-jgwrw\"" Apr 17 16:33:21.877562 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.877547 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 16:33:21.878484 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.878469 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:33:21.893024 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.893003 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp58c"] Apr 17 16:33:21.963929 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.963900 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj"] Apr 17 16:33:21.965774 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.965760 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj" Apr 17 16:33:21.972262 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.971596 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 16:33:21.972262 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.972210 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 17 16:33:21.972262 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.972238 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 16:33:21.972818 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.972792 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-6fnn4\"" Apr 17 16:33:21.973259 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.973238 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 17 16:33:21.973512 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.973482 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kzsgh"] Apr 17 16:33:21.975345 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.975328 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-56bf8bb974-6ljqr"] Apr 17 16:33:21.975479 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.975463 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kzsgh" Apr 17 16:33:21.977970 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.977951 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wkqhn"] Apr 17 16:33:21.978091 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.978075 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:21.978407 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.978389 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-n6k9g\"" Apr 17 16:33:21.978502 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.978419 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 16:33:21.978502 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.978475 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 16:33:21.978616 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.978444 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:33:21.978616 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.978432 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 16:33:21.979611 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.979596 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj"] Apr 17 16:33:21.979690 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.979666 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wkqhn" Apr 17 16:33:21.981866 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.981758 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 16:33:21.982159 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.982142 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 16:33:21.982359 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.982324 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 16:33:21.982602 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.982533 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:33:21.982602 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.982541 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 17 16:33:21.982920 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.982892 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 17 16:33:21.983550 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.983524 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-7bsnh\"" Apr 17 16:33:21.983661 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.983594 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 17 16:33:21.985219 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.985090 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-zn7wr\"" Apr 17 16:33:21.985324 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.985268 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kzsgh"] Apr 17 16:33:21.987835 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.987820 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 16:33:21.987941 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.987904 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wkqhn"] Apr 17 16:33:21.993495 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:21.993476 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-56bf8bb974-6ljqr"] Apr 17 16:33:22.024132 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.024105 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5dt7\" (UniqueName: \"kubernetes.io/projected/7324ef3b-f554-4b99-8416-ea798d8b7d08-kube-api-access-f5dt7\") pod \"volume-data-source-validator-7c6cbb6c87-q49fp\" (UID: \"7324ef3b-f554-4b99-8416-ea798d8b7d08\") " 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q49fp" Apr 17 16:33:22.024233 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.024135 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z5rd\" (UniqueName: \"kubernetes.io/projected/9b335685-ec6e-4bb8-8299-84a173f0dd47-kube-api-access-7z5rd\") pod \"cluster-samples-operator-6dc5bdb6b4-cp58c\" (UID: \"9b335685-ec6e-4bb8-8299-84a173f0dd47\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp58c" Apr 17 16:33:22.024294 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.024252 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b335685-ec6e-4bb8-8299-84a173f0dd47-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-cp58c\" (UID: \"9b335685-ec6e-4bb8-8299-84a173f0dd47\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp58c" Apr 17 16:33:22.125260 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.125202 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b335685-ec6e-4bb8-8299-84a173f0dd47-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-cp58c\" (UID: \"9b335685-ec6e-4bb8-8299-84a173f0dd47\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp58c" Apr 17 16:33:22.125260 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.125231 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8psjj\" (UID: \"857db8bf-55a7-4dbe-a3e9-277f452b9fb9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj" Apr 17 16:33:22.125260 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.125253 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/557e1be2-405d-4d48-bff8-fc7fef95cc7c-config\") pod \"service-ca-operator-d6fc45fc5-kzsgh\" (UID: \"557e1be2-405d-4d48-bff8-fc7fef95cc7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kzsgh" Apr 17 16:33:22.125455 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.125269 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-registry-tls\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.125455 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.125315 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b646c71-de31-4f51-bd49-9a2445937672-trusted-ca\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.125455 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:22.125397 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret 
"samples-operator-tls" not found Apr 17 16:33:22.125455 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.125407 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-8psjj\" (UID: \"857db8bf-55a7-4dbe-a3e9-277f452b9fb9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj" Apr 17 16:33:22.125455 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:22.125446 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b335685-ec6e-4bb8-8299-84a173f0dd47-samples-operator-tls podName:9b335685-ec6e-4bb8-8299-84a173f0dd47 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:22.625430802 +0000 UTC m=+123.848758558 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9b335685-ec6e-4bb8-8299-84a173f0dd47-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-cp58c" (UID: "9b335685-ec6e-4bb8-8299-84a173f0dd47") : secret "samples-operator-tls" not found Apr 17 16:33:22.125672 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.125521 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5dt7\" (UniqueName: \"kubernetes.io/projected/7324ef3b-f554-4b99-8416-ea798d8b7d08-kube-api-access-f5dt7\") pod \"volume-data-source-validator-7c6cbb6c87-q49fp\" (UID: \"7324ef3b-f554-4b99-8416-ea798d8b7d08\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q49fp" Apr 17 16:33:22.125672 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.125552 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7z5rd\" (UniqueName: \"kubernetes.io/projected/9b335685-ec6e-4bb8-8299-84a173f0dd47-kube-api-access-7z5rd\") pod \"cluster-samples-operator-6dc5bdb6b4-cp58c\" (UID: \"9b335685-ec6e-4bb8-8299-84a173f0dd47\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp58c" Apr 17 16:33:22.125672 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.125579 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/557e1be2-405d-4d48-bff8-fc7fef95cc7c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-kzsgh\" (UID: \"557e1be2-405d-4d48-bff8-fc7fef95cc7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kzsgh" Apr 17 16:33:22.125672 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.125606 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3b646c71-de31-4f51-bd49-9a2445937672-image-registry-private-configuration\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.125672 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.125654 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b646c71-de31-4f51-bd49-9a2445937672-registry-certificates\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " 
pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.125857 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.125677 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b646c71-de31-4f51-bd49-9a2445937672-installation-pull-secrets\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.125857 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.125700 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkgfj\" (UniqueName: \"kubernetes.io/projected/557e1be2-405d-4d48-bff8-fc7fef95cc7c-kube-api-access-hkgfj\") pod \"service-ca-operator-d6fc45fc5-kzsgh\" (UID: \"557e1be2-405d-4d48-bff8-fc7fef95cc7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kzsgh" Apr 17 16:33:22.125857 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.125762 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-bound-sa-token\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.125857 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.125778 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-wkqhn\" (UID: \"9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wkqhn" Apr 17 16:33:22.125857 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.125817 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8dnv\" (UniqueName: \"kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-kube-api-access-l8dnv\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.126002 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.125861 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdstg\" (UniqueName: \"kubernetes.io/projected/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-kube-api-access-zdstg\") pod \"cluster-monitoring-operator-75587bd455-8psjj\" (UID: \"857db8bf-55a7-4dbe-a3e9-277f452b9fb9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj" Apr 17 16:33:22.126002 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.125882 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-wkqhn\" (UID: \"9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wkqhn" Apr 17 16:33:22.126002 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.125899 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzj5j\" (UniqueName: \"kubernetes.io/projected/9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf-kube-api-access-vzj5j\") pod \"kube-storage-version-migrator-operator-6769c5d45-wkqhn\" (UID: \"9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wkqhn" Apr 17 16:33:22.126002 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.125925 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b646c71-de31-4f51-bd49-9a2445937672-ca-trust-extracted\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.134121 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.134093 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5dt7\" (UniqueName: \"kubernetes.io/projected/7324ef3b-f554-4b99-8416-ea798d8b7d08-kube-api-access-f5dt7\") pod \"volume-data-source-validator-7c6cbb6c87-q49fp\" (UID: \"7324ef3b-f554-4b99-8416-ea798d8b7d08\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q49fp" Apr 17 16:33:22.134233 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.134218 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z5rd\" (UniqueName: \"kubernetes.io/projected/9b335685-ec6e-4bb8-8299-84a173f0dd47-kube-api-access-7z5rd\") pod \"cluster-samples-operator-6dc5bdb6b4-cp58c\" (UID: \"9b335685-ec6e-4bb8-8299-84a173f0dd47\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp58c" Apr 17 16:33:22.173226 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.173204 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q49fp" Apr 17 16:33:22.227357 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.227317 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkgfj\" (UniqueName: \"kubernetes.io/projected/557e1be2-405d-4d48-bff8-fc7fef95cc7c-kube-api-access-hkgfj\") pod \"service-ca-operator-d6fc45fc5-kzsgh\" (UID: \"557e1be2-405d-4d48-bff8-fc7fef95cc7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kzsgh" Apr 17 16:33:22.227484 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.227384 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-bound-sa-token\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.227484 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.227417 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-wkqhn\" (UID: \"9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wkqhn" Apr 17 16:33:22.228542 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.227817 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8dnv\" (UniqueName: \"kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-kube-api-access-l8dnv\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.228542 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.227892 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdstg\" (UniqueName: \"kubernetes.io/projected/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-kube-api-access-zdstg\") pod \"cluster-monitoring-operator-75587bd455-8psjj\" (UID: \"857db8bf-55a7-4dbe-a3e9-277f452b9fb9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj" Apr 17 16:33:22.228542 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.227919 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-wkqhn\" (UID: \"9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wkqhn" Apr 17 16:33:22.228542 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.227947 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzj5j\" (UniqueName: \"kubernetes.io/projected/9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf-kube-api-access-vzj5j\") pod \"kube-storage-version-migrator-operator-6769c5d45-wkqhn\" (UID: \"9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wkqhn" Apr 17 16:33:22.228542 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.227989 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b646c71-de31-4f51-bd49-9a2445937672-ca-trust-extracted\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.228542 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.228047 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8psjj\" (UID: \"857db8bf-55a7-4dbe-a3e9-277f452b9fb9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj" Apr 17 16:33:22.228542 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.228053 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-wkqhn\" (UID: \"9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wkqhn" Apr 17 16:33:22.228542 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.228110 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/557e1be2-405d-4d48-bff8-fc7fef95cc7c-config\") pod \"service-ca-operator-d6fc45fc5-kzsgh\" (UID: \"557e1be2-405d-4d48-bff8-fc7fef95cc7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kzsgh" Apr 17 16:33:22.228542 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.228142 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-registry-tls\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.228542 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:22.228149 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:22.228542 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.228168 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b646c71-de31-4f51-bd49-9a2445937672-trusted-ca\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.228542 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:22.228207 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-cluster-monitoring-operator-tls podName:857db8bf-55a7-4dbe-a3e9-277f452b9fb9 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:22.728189172 +0000 UTC m=+123.951516933 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8psjj" (UID: "857db8bf-55a7-4dbe-a3e9-277f452b9fb9") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:22.228542 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.228226 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-8psjj\" (UID: \"857db8bf-55a7-4dbe-a3e9-277f452b9fb9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj" Apr 17 16:33:22.228542 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.228261 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/557e1be2-405d-4d48-bff8-fc7fef95cc7c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-kzsgh\" (UID: \"557e1be2-405d-4d48-bff8-fc7fef95cc7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kzsgh" Apr 17 16:33:22.228542 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.228287 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3b646c71-de31-4f51-bd49-9a2445937672-image-registry-private-configuration\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.229409 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.228324 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b646c71-de31-4f51-bd49-9a2445937672-registry-certificates\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.229409 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:22.228442 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:33:22.229409 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:22.228455 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56bf8bb974-6ljqr: secret "image-registry-tls" not found Apr 17 16:33:22.229409 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:22.228527 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-registry-tls podName:3b646c71-de31-4f51-bd49-9a2445937672 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:22.728510562 +0000 UTC m=+123.951838317 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-registry-tls") pod "image-registry-56bf8bb974-6ljqr" (UID: "3b646c71-de31-4f51-bd49-9a2445937672") : secret "image-registry-tls" not found Apr 17 16:33:22.229409 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.228783 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/557e1be2-405d-4d48-bff8-fc7fef95cc7c-config\") pod \"service-ca-operator-d6fc45fc5-kzsgh\" (UID: \"557e1be2-405d-4d48-bff8-fc7fef95cc7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kzsgh" Apr 17 16:33:22.229409 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.228349 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b646c71-de31-4f51-bd49-9a2445937672-installation-pull-secrets\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.229409 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.229276 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b646c71-de31-4f51-bd49-9a2445937672-registry-certificates\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.229760 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.229477 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b646c71-de31-4f51-bd49-9a2445937672-ca-trust-extracted\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.230259 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.230212 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-8psjj\" (UID: \"857db8bf-55a7-4dbe-a3e9-277f452b9fb9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj" Apr 17 16:33:22.232800 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.232589 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b646c71-de31-4f51-bd49-9a2445937672-trusted-ca\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.233656 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.233513 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-wkqhn\" (UID: \"9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wkqhn" Apr 17 16:33:22.233656 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.233573 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/557e1be2-405d-4d48-bff8-fc7fef95cc7c-serving-cert\") pod \"service-ca-operator-d6fc45fc5-kzsgh\" (UID: \"557e1be2-405d-4d48-bff8-fc7fef95cc7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kzsgh" Apr 17 16:33:22.233656 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.233615 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3b646c71-de31-4f51-bd49-9a2445937672-image-registry-private-configuration\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.234179 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.234067 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b646c71-de31-4f51-bd49-9a2445937672-installation-pull-secrets\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.236810 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.236779 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdstg\" (UniqueName: \"kubernetes.io/projected/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-kube-api-access-zdstg\") pod \"cluster-monitoring-operator-75587bd455-8psjj\" (UID: \"857db8bf-55a7-4dbe-a3e9-277f452b9fb9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj" Apr 17 16:33:22.237341 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.237117 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8dnv\" (UniqueName: \"kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-kube-api-access-l8dnv\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.237341 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.237297 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-bound-sa-token\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.238497 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.238457 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzj5j\" (UniqueName: \"kubernetes.io/projected/9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf-kube-api-access-vzj5j\") pod \"kube-storage-version-migrator-operator-6769c5d45-wkqhn\" (UID: \"9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wkqhn" Apr 17 16:33:22.238497 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.238479 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkgfj\" (UniqueName: \"kubernetes.io/projected/557e1be2-405d-4d48-bff8-fc7fef95cc7c-kube-api-access-hkgfj\") pod \"service-ca-operator-d6fc45fc5-kzsgh\" (UID: \"557e1be2-405d-4d48-bff8-fc7fef95cc7c\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kzsgh" Apr 17 16:33:22.285160 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.283817 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q49fp"] Apr 17 16:33:22.286846 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.286828 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kzsgh" Apr 17 16:33:22.288587 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:33:22.288566 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7324ef3b_f554_4b99_8416_ea798d8b7d08.slice/crio-4675ee27ed98f94b8ba1bace2f0b52366e84a3e268d0c03c8ff2c19462cc148d WatchSource:0}: Error finding container 4675ee27ed98f94b8ba1bace2f0b52366e84a3e268d0c03c8ff2c19462cc148d: Status 404 returned error can't find the container with id 4675ee27ed98f94b8ba1bace2f0b52366e84a3e268d0c03c8ff2c19462cc148d Apr 17 16:33:22.299507 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.299485 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wkqhn" Apr 17 16:33:22.409665 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.409638 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kzsgh"] Apr 17 16:33:22.412425 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:33:22.412400 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod557e1be2_405d_4d48_bff8_fc7fef95cc7c.slice/crio-fc2864ca2547e299bc0a90b3a9e8d2dc0224893540ded941362efa6086c7f21a WatchSource:0}: Error finding container fc2864ca2547e299bc0a90b3a9e8d2dc0224893540ded941362efa6086c7f21a: Status 404 returned error can't find the container with id fc2864ca2547e299bc0a90b3a9e8d2dc0224893540ded941362efa6086c7f21a Apr 17 16:33:22.420089 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.420067 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wkqhn"] Apr 17 16:33:22.422479 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:33:22.422461 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bb3a924_c4e5_4338_9f7c_b9efa83f3cdf.slice/crio-37adef5d7bb4c0eacd18e39ca38881eb3d475608dd7cec09fc64abd3db652626 WatchSource:0}: Error finding container 37adef5d7bb4c0eacd18e39ca38881eb3d475608dd7cec09fc64abd3db652626: Status 404 returned error can't find the container with id 37adef5d7bb4c0eacd18e39ca38881eb3d475608dd7cec09fc64abd3db652626 Apr 17 16:33:22.632380 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.632278 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b335685-ec6e-4bb8-8299-84a173f0dd47-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-cp58c\" (UID: \"9b335685-ec6e-4bb8-8299-84a173f0dd47\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp58c" Apr 17 16:33:22.632521 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:22.632427 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:33:22.632521 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:22.632489 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/9b335685-ec6e-4bb8-8299-84a173f0dd47-samples-operator-tls podName:9b335685-ec6e-4bb8-8299-84a173f0dd47 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:23.632473255 +0000 UTC m=+124.855801015 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9b335685-ec6e-4bb8-8299-84a173f0dd47-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-cp58c" (UID: "9b335685-ec6e-4bb8-8299-84a173f0dd47") : secret "samples-operator-tls" not found Apr 17 16:33:22.725789 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.725757 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kzsgh" event={"ID":"557e1be2-405d-4d48-bff8-fc7fef95cc7c","Type":"ContainerStarted","Data":"fc2864ca2547e299bc0a90b3a9e8d2dc0224893540ded941362efa6086c7f21a"} Apr 17 16:33:22.726576 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.726552 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q49fp" event={"ID":"7324ef3b-f554-4b99-8416-ea798d8b7d08","Type":"ContainerStarted","Data":"4675ee27ed98f94b8ba1bace2f0b52366e84a3e268d0c03c8ff2c19462cc148d"} Apr 17 16:33:22.727438 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.727418 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wkqhn" event={"ID":"9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf","Type":"ContainerStarted","Data":"37adef5d7bb4c0eacd18e39ca38881eb3d475608dd7cec09fc64abd3db652626"} Apr 17 16:33:22.732844 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.732817 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8psjj\" (UID: \"857db8bf-55a7-4dbe-a3e9-277f452b9fb9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj" Apr 17 16:33:22.732943 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:22.732864 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-registry-tls\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:22.732982 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:22.732962 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:22.733026 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:22.732998 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:33:22.733026 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:22.733017 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56bf8bb974-6ljqr: secret "image-registry-tls" not found Apr 17 16:33:22.733095 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:22.733028 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-cluster-monitoring-operator-tls 
podName:857db8bf-55a7-4dbe-a3e9-277f452b9fb9 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:23.733012816 +0000 UTC m=+124.956340577 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8psjj" (UID: "857db8bf-55a7-4dbe-a3e9-277f452b9fb9") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:22.733095 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:22.733064 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-registry-tls podName:3b646c71-de31-4f51-bd49-9a2445937672 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:23.733049358 +0000 UTC m=+124.956377129 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-registry-tls") pod "image-registry-56bf8bb974-6ljqr" (UID: "3b646c71-de31-4f51-bd49-9a2445937672") : secret "image-registry-tls" not found Apr 17 16:33:23.640958 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:23.640916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b335685-ec6e-4bb8-8299-84a173f0dd47-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-cp58c\" (UID: \"9b335685-ec6e-4bb8-8299-84a173f0dd47\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp58c" Apr 17 16:33:23.641451 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:23.641123 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:33:23.641451 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:23.641185 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b335685-ec6e-4bb8-8299-84a173f0dd47-samples-operator-tls podName:9b335685-ec6e-4bb8-8299-84a173f0dd47 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:25.641166809 +0000 UTC m=+126.864494569 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9b335685-ec6e-4bb8-8299-84a173f0dd47-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-cp58c" (UID: "9b335685-ec6e-4bb8-8299-84a173f0dd47") : secret "samples-operator-tls" not found Apr 17 16:33:23.742010 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:23.741979 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8psjj\" (UID: \"857db8bf-55a7-4dbe-a3e9-277f452b9fb9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj" Apr 17 16:33:23.742174 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:23.742029 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-registry-tls\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:23.742174 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:23.742139 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:33:23.742174 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:23.742166 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56bf8bb974-6ljqr: secret "image-registry-tls" not found Apr 17 16:33:23.742350 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:23.742221 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-registry-tls podName:3b646c71-de31-4f51-bd49-9a2445937672 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:25.742201025 +0000 UTC m=+126.965528782 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-registry-tls") pod "image-registry-56bf8bb974-6ljqr" (UID: "3b646c71-de31-4f51-bd49-9a2445937672") : secret "image-registry-tls" not found Apr 17 16:33:23.742350 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:23.742137 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:23.742350 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:23.742295 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-cluster-monitoring-operator-tls podName:857db8bf-55a7-4dbe-a3e9-277f452b9fb9 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:25.742280525 +0000 UTC m=+126.965608283 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8psjj" (UID: "857db8bf-55a7-4dbe-a3e9-277f452b9fb9") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:25.657725 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:25.657691 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b335685-ec6e-4bb8-8299-84a173f0dd47-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-cp58c\" (UID: \"9b335685-ec6e-4bb8-8299-84a173f0dd47\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp58c" Apr 17 16:33:25.658093 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:25.657837 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:33:25.658093 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:25.657903 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b335685-ec6e-4bb8-8299-84a173f0dd47-samples-operator-tls podName:9b335685-ec6e-4bb8-8299-84a173f0dd47 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:29.657887666 +0000 UTC m=+130.881215421 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9b335685-ec6e-4bb8-8299-84a173f0dd47-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-cp58c" (UID: "9b335685-ec6e-4bb8-8299-84a173f0dd47") : secret "samples-operator-tls" not found Apr 17 16:33:25.737847 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:25.737809 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q49fp" event={"ID":"7324ef3b-f554-4b99-8416-ea798d8b7d08","Type":"ContainerStarted","Data":"b8decba34c53a7b8c7892c3903f83d288de32ad02457c7be9bf6f6cf61952f90"} Apr 17 16:33:25.739203 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:25.739174 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wkqhn" event={"ID":"9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf","Type":"ContainerStarted","Data":"797588b2a331b80d2c503a229225811f89aad7a5e2e66c40cd1de953d86c38e8"} Apr 17 16:33:25.740504 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:25.740481 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kzsgh" event={"ID":"557e1be2-405d-4d48-bff8-fc7fef95cc7c","Type":"ContainerStarted","Data":"b573fedbc1e94ea946abd16d7d6a12259c6cdd365895452ff1d5b88e68b7c08d"} Apr 17 16:33:25.753529 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:25.753482 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q49fp" podStartSLOduration=2.028610803 podStartE2EDuration="4.753468556s" podCreationTimestamp="2026-04-17 16:33:21 +0000 UTC" firstStartedPulling="2026-04-17 16:33:22.290903197 +0000 UTC m=+123.514230958" lastFinishedPulling="2026-04-17 16:33:25.015760939 +0000 UTC m=+126.239088711" observedRunningTime="2026-04-17 16:33:25.752356894 +0000 UTC m=+126.975684697" watchObservedRunningTime="2026-04-17 16:33:25.753468556 +0000 UTC m=+126.976796335" Apr 17 
16:33:25.758593 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:25.758569 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8psjj\" (UID: \"857db8bf-55a7-4dbe-a3e9-277f452b9fb9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj" Apr 17 16:33:25.758688 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:25.758610 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-registry-tls\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:25.758750 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:25.758692 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:25.758842 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:25.758759 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-cluster-monitoring-operator-tls podName:857db8bf-55a7-4dbe-a3e9-277f452b9fb9 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:29.75874559 +0000 UTC m=+130.982073346 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8psjj" (UID: "857db8bf-55a7-4dbe-a3e9-277f452b9fb9") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:25.758842 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:25.758781 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:33:25.758842 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:25.758796 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56bf8bb974-6ljqr: secret "image-registry-tls" not found Apr 17 16:33:25.758842 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:25.758837 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-registry-tls podName:3b646c71-de31-4f51-bd49-9a2445937672 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:29.758823782 +0000 UTC m=+130.982151545 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-registry-tls") pod "image-registry-56bf8bb974-6ljqr" (UID: "3b646c71-de31-4f51-bd49-9a2445937672") : secret "image-registry-tls" not found Apr 17 16:33:25.772075 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:25.772034 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kzsgh" podStartSLOduration=2.174837051 podStartE2EDuration="4.772025282s" podCreationTimestamp="2026-04-17 16:33:21 +0000 UTC" firstStartedPulling="2026-04-17 16:33:22.414137412 +0000 UTC m=+123.637465169" lastFinishedPulling="2026-04-17 16:33:25.011325643 +0000 UTC m=+126.234653400" observedRunningTime="2026-04-17 16:33:25.770953595 +0000 UTC m=+126.994281374" watchObservedRunningTime="2026-04-17 16:33:25.772025282 +0000 UTC m=+126.995353059" Apr 17 16:33:25.786950 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:25.786906 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wkqhn" podStartSLOduration=2.194385252 podStartE2EDuration="4.786892785s" podCreationTimestamp="2026-04-17 16:33:21 +0000 UTC" firstStartedPulling="2026-04-17 16:33:22.424047971 +0000 UTC m=+123.647375728" lastFinishedPulling="2026-04-17 16:33:25.016555503 +0000 UTC m=+126.239883261" observedRunningTime="2026-04-17 16:33:25.78540744 +0000 UTC m=+127.008735221" watchObservedRunningTime="2026-04-17 16:33:25.786892785 +0000 UTC m=+127.010220566" Apr 17 16:33:28.083448 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:28.083418 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pbmrc_ec869d90-6e5f-4329-9d3c-62938cb140e5/dns-node-resolver/0.log" Apr 17 16:33:29.082261 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:29.082220 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs\") pod \"network-metrics-daemon-42tgv\" (UID: \"b57612e9-f335-4d71-bdba-f06f0735eee1\") " pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:33:29.082443 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:29.082363 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 16:33:29.082443 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:29.082428 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs podName:b57612e9-f335-4d71-bdba-f06f0735eee1 nodeName:}" failed. No retries permitted until 2026-04-17 16:35:31.082412594 +0000 UTC m=+252.305740350 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs") pod "network-metrics-daemon-42tgv" (UID: "b57612e9-f335-4d71-bdba-f06f0735eee1") : secret "metrics-daemon-secret" not found Apr 17 16:33:29.083019 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:29.083002 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-6jdwh_e613ab88-5ce8-4dda-a30c-57006804fdb0/node-ca/0.log" Apr 17 16:33:29.685946 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:29.685912 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b335685-ec6e-4bb8-8299-84a173f0dd47-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-cp58c\" (UID: \"9b335685-ec6e-4bb8-8299-84a173f0dd47\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp58c" Apr 17 16:33:29.686331 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:29.686037 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 16:33:29.686331 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:29.686098 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b335685-ec6e-4bb8-8299-84a173f0dd47-samples-operator-tls podName:9b335685-ec6e-4bb8-8299-84a173f0dd47 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:37.686084406 +0000 UTC m=+138.909412162 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/9b335685-ec6e-4bb8-8299-84a173f0dd47-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-cp58c" (UID: "9b335685-ec6e-4bb8-8299-84a173f0dd47") : secret "samples-operator-tls" not found Apr 17 16:33:29.786376 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:29.786341 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8psjj\" (UID: \"857db8bf-55a7-4dbe-a3e9-277f452b9fb9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj" Apr 17 16:33:29.786376 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:29.786382 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-registry-tls\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:29.786625 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:29.786496 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 16:33:29.786625 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:29.786509 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-56bf8bb974-6ljqr: secret "image-registry-tls" not found Apr 17 16:33:29.786625 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:29.786506 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:29.786625 ip-10-0-129-144 
kubenswrapper[2574]: E0417 16:33:29.786563 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-registry-tls podName:3b646c71-de31-4f51-bd49-9a2445937672 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:37.786546986 +0000 UTC m=+139.009874746 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-registry-tls") pod "image-registry-56bf8bb974-6ljqr" (UID: "3b646c71-de31-4f51-bd49-9a2445937672") : secret "image-registry-tls" not found Apr 17 16:33:29.786625 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:29.786577 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-cluster-monitoring-operator-tls podName:857db8bf-55a7-4dbe-a3e9-277f452b9fb9 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:37.786571125 +0000 UTC m=+139.009898880 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8psjj" (UID: "857db8bf-55a7-4dbe-a3e9-277f452b9fb9") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:37.747546 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:37.747492 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b335685-ec6e-4bb8-8299-84a173f0dd47-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-cp58c\" (UID: \"9b335685-ec6e-4bb8-8299-84a173f0dd47\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp58c" Apr 17 16:33:37.750012 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:37.749987 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/9b335685-ec6e-4bb8-8299-84a173f0dd47-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-cp58c\" (UID: \"9b335685-ec6e-4bb8-8299-84a173f0dd47\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp58c" Apr 17 16:33:37.782170 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:37.782143 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp58c" Apr 17 16:33:37.848666 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:37.848632 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8psjj\" (UID: \"857db8bf-55a7-4dbe-a3e9-277f452b9fb9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj" Apr 17 16:33:37.848833 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:37.848683 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-registry-tls\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:37.849159 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:37.848975 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:37.849159 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:37.849070 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-cluster-monitoring-operator-tls podName:857db8bf-55a7-4dbe-a3e9-277f452b9fb9 nodeName:}" failed. No retries permitted until 2026-04-17 16:33:53.849047497 +0000 UTC m=+155.072375257 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8psjj" (UID: "857db8bf-55a7-4dbe-a3e9-277f452b9fb9") : secret "cluster-monitoring-operator-tls" not found Apr 17 16:33:37.850983 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:37.850959 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-registry-tls\") pod \"image-registry-56bf8bb974-6ljqr\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:37.894565 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:37.894532 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:37.903186 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:37.903044 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp58c"] Apr 17 16:33:38.012055 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:38.011984 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-56bf8bb974-6ljqr"] Apr 17 16:33:38.014794 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:33:38.014767 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b646c71_de31_4f51_bd49_9a2445937672.slice/crio-9d9cd9831e5219fdd4938fade139bbc6fbd6037f6f9449012f2243ef372a1fc7 WatchSource:0}: Error finding container 9d9cd9831e5219fdd4938fade139bbc6fbd6037f6f9449012f2243ef372a1fc7: Status 404 returned error can't find the container with id 9d9cd9831e5219fdd4938fade139bbc6fbd6037f6f9449012f2243ef372a1fc7 Apr 17 16:33:38.768999 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:38.768960 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp58c" event={"ID":"9b335685-ec6e-4bb8-8299-84a173f0dd47","Type":"ContainerStarted","Data":"cce96ff5f616b97783119d103e5885eb9effb32f13c5749d2615af08bf4e8f69"} Apr 17 16:33:38.770373 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:38.770348 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" event={"ID":"3b646c71-de31-4f51-bd49-9a2445937672","Type":"ContainerStarted","Data":"aa2d3a07f119b6f5034163bb2873a2f236a3c17561e6dcf482e3d9b767b31710"} Apr 17 16:33:38.770373 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:38.770377 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" event={"ID":"3b646c71-de31-4f51-bd49-9a2445937672","Type":"ContainerStarted","Data":"9d9cd9831e5219fdd4938fade139bbc6fbd6037f6f9449012f2243ef372a1fc7"} Apr 17 16:33:38.770542 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:38.770512 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:33:38.791892 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:38.791846 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" podStartSLOduration=17.791832878 podStartE2EDuration="17.791832878s" podCreationTimestamp="2026-04-17 16:33:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:33:38.791008837 +0000 UTC m=+140.014336627" watchObservedRunningTime="2026-04-17 16:33:38.791832878 +0000 UTC m=+140.015160656" Apr 17 16:33:39.774371 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:39.774338 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp58c" event={"ID":"9b335685-ec6e-4bb8-8299-84a173f0dd47","Type":"ContainerStarted","Data":"0be405e1d70fdbf2d5f4485a10990c7799950da43972bfeefd379fce33468729"} Apr 17 16:33:39.774706 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:39.774378 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp58c" event={"ID":"9b335685-ec6e-4bb8-8299-84a173f0dd47","Type":"ContainerStarted","Data":"33492aa71e1cf28b6f259c98365287c0033b01d547b6d463313f524ef679ebe5"} Apr 17 16:33:39.790747 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:39.790698 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-cp58c" podStartSLOduration=17.149306835 podStartE2EDuration="18.790684611s" podCreationTimestamp="2026-04-17 16:33:21 +0000 UTC" firstStartedPulling="2026-04-17 16:33:37.941834114 +0000 UTC m=+139.165161873" lastFinishedPulling="2026-04-17 16:33:39.583211886 +0000 UTC m=+140.806539649" observedRunningTime="2026-04-17 16:33:39.789541686 +0000 UTC m=+141.012869463" watchObservedRunningTime="2026-04-17 16:33:39.790684611 +0000 UTC m=+141.014012388" Apr 17 16:33:47.930378 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:47.930343 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-r6l2s"] Apr 17 16:33:47.934767 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:47.934750 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-r6l2s" Apr 17 16:33:47.939790 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:47.939719 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 16:33:47.940356 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:47.940088 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 16:33:47.940356 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:47.940256 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 16:33:47.940521 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:47.940355 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 16:33:47.944763 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:47.944725 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-pr7rd\"" Apr 17 16:33:47.956582 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:47.956560 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-r6l2s"] Apr 17 16:33:47.970014 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:47.969992 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-56bf8bb974-6ljqr"] Apr 17 16:33:48.021657 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.021619 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a65f5aab-b1e8-48b9-80dc-4d625056d509-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-r6l2s\" (UID: \"a65f5aab-b1e8-48b9-80dc-4d625056d509\") " pod="openshift-insights/insights-runtime-extractor-r6l2s" Apr 17 16:33:48.021657 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.021661 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6bx7\" (UniqueName: 
\"kubernetes.io/projected/a65f5aab-b1e8-48b9-80dc-4d625056d509-kube-api-access-d6bx7\") pod \"insights-runtime-extractor-r6l2s\" (UID: \"a65f5aab-b1e8-48b9-80dc-4d625056d509\") " pod="openshift-insights/insights-runtime-extractor-r6l2s" Apr 17 16:33:48.021930 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.021796 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a65f5aab-b1e8-48b9-80dc-4d625056d509-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-r6l2s\" (UID: \"a65f5aab-b1e8-48b9-80dc-4d625056d509\") " pod="openshift-insights/insights-runtime-extractor-r6l2s" Apr 17 16:33:48.021930 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.021857 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a65f5aab-b1e8-48b9-80dc-4d625056d509-data-volume\") pod \"insights-runtime-extractor-r6l2s\" (UID: \"a65f5aab-b1e8-48b9-80dc-4d625056d509\") " pod="openshift-insights/insights-runtime-extractor-r6l2s" Apr 17 16:33:48.021930 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.021887 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a65f5aab-b1e8-48b9-80dc-4d625056d509-crio-socket\") pod \"insights-runtime-extractor-r6l2s\" (UID: \"a65f5aab-b1e8-48b9-80dc-4d625056d509\") " pod="openshift-insights/insights-runtime-extractor-r6l2s" Apr 17 16:33:48.029959 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.029932 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5cfd58fd7f-ztnxd"] Apr 17 16:33:48.033377 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.033362 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.060960 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.060931 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cfd58fd7f-ztnxd"] Apr 17 16:33:48.122287 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.122255 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a65f5aab-b1e8-48b9-80dc-4d625056d509-data-volume\") pod \"insights-runtime-extractor-r6l2s\" (UID: \"a65f5aab-b1e8-48b9-80dc-4d625056d509\") " pod="openshift-insights/insights-runtime-extractor-r6l2s" Apr 17 16:33:48.122454 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.122298 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a65f5aab-b1e8-48b9-80dc-4d625056d509-crio-socket\") pod \"insights-runtime-extractor-r6l2s\" (UID: \"a65f5aab-b1e8-48b9-80dc-4d625056d509\") " pod="openshift-insights/insights-runtime-extractor-r6l2s" Apr 17 16:33:48.122454 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.122321 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-trusted-ca\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.122454 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.122338 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-registry-tls\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.122454 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.122400 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a65f5aab-b1e8-48b9-80dc-4d625056d509-crio-socket\") pod \"insights-runtime-extractor-r6l2s\" (UID: \"a65f5aab-b1e8-48b9-80dc-4d625056d509\") " pod="openshift-insights/insights-runtime-extractor-r6l2s" Apr 17 16:33:48.122454 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.122403 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-image-registry-private-configuration\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.122454 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.122446 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-registry-certificates\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.122685 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.122465 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-installation-pull-secrets\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.122685 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.122530 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a65f5aab-b1e8-48b9-80dc-4d625056d509-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-r6l2s\" (UID: \"a65f5aab-b1e8-48b9-80dc-4d625056d509\") " pod="openshift-insights/insights-runtime-extractor-r6l2s" Apr 17 16:33:48.122685 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.122556 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-bound-sa-token\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.122685 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.122575 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzvbb\" (UniqueName: \"kubernetes.io/projected/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-kube-api-access-hzvbb\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.122685 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.122590 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-ca-trust-extracted\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.122685 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.122558 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a65f5aab-b1e8-48b9-80dc-4d625056d509-data-volume\") pod \"insights-runtime-extractor-r6l2s\" (UID: \"a65f5aab-b1e8-48b9-80dc-4d625056d509\") " pod="openshift-insights/insights-runtime-extractor-r6l2s" Apr 17 16:33:48.122685 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.122623 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6bx7\" (UniqueName: \"kubernetes.io/projected/a65f5aab-b1e8-48b9-80dc-4d625056d509-kube-api-access-d6bx7\") pod \"insights-runtime-extractor-r6l2s\" (UID: \"a65f5aab-b1e8-48b9-80dc-4d625056d509\") " pod="openshift-insights/insights-runtime-extractor-r6l2s" Apr 17 16:33:48.122948 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.122750 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a65f5aab-b1e8-48b9-80dc-4d625056d509-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-r6l2s\" (UID: \"a65f5aab-b1e8-48b9-80dc-4d625056d509\") " pod="openshift-insights/insights-runtime-extractor-r6l2s" Apr 17 16:33:48.123223 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.123207 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a65f5aab-b1e8-48b9-80dc-4d625056d509-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-r6l2s\" (UID: \"a65f5aab-b1e8-48b9-80dc-4d625056d509\") " pod="openshift-insights/insights-runtime-extractor-r6l2s" Apr 17 16:33:48.124960 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.124937 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a65f5aab-b1e8-48b9-80dc-4d625056d509-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-r6l2s\" (UID: \"a65f5aab-b1e8-48b9-80dc-4d625056d509\") " pod="openshift-insights/insights-runtime-extractor-r6l2s" Apr 17 16:33:48.138455 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.138425 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6bx7\" (UniqueName: \"kubernetes.io/projected/a65f5aab-b1e8-48b9-80dc-4d625056d509-kube-api-access-d6bx7\") pod \"insights-runtime-extractor-r6l2s\" (UID: \"a65f5aab-b1e8-48b9-80dc-4d625056d509\") " pod="openshift-insights/insights-runtime-extractor-r6l2s" Apr 17 16:33:48.223061 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.222987 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-bound-sa-token\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.223061 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.223020 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hzvbb\" (UniqueName: \"kubernetes.io/projected/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-kube-api-access-hzvbb\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.223061 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.223036 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-ca-trust-extracted\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.223266 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.223215 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-trusted-ca\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.223266 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.223258 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-registry-tls\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.223356 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.223288 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-image-registry-private-configuration\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.223356 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.223316 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-registry-certificates\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.223356 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.223339 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-installation-pull-secrets\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.223519 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.223383 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-ca-trust-extracted\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.224166 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.224138 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-registry-certificates\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.224288 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.224142 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-trusted-ca\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.226041 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.226014 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-image-registry-private-configuration\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.226137 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.226122 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-registry-tls\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.226185 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.226170 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-installation-pull-secrets\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.239034 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.239011 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-bound-sa-token\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.239499 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.239484 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzvbb\" (UniqueName: \"kubernetes.io/projected/ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0-kube-api-access-hzvbb\") pod \"image-registry-5cfd58fd7f-ztnxd\" (UID: \"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0\") " pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.245290 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.245271 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-r6l2s" Apr 17 16:33:48.342262 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.342231 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.365066 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.365040 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-r6l2s"] Apr 17 16:33:48.368726 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:33:48.368700 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda65f5aab_b1e8_48b9_80dc_4d625056d509.slice/crio-75589531893a69f689ac5b563aab9d90534e4eb5ceb833171f9441c3ed4c7c8e WatchSource:0}: Error finding container 75589531893a69f689ac5b563aab9d90534e4eb5ceb833171f9441c3ed4c7c8e: Status 404 returned error can't find the container with id 75589531893a69f689ac5b563aab9d90534e4eb5ceb833171f9441c3ed4c7c8e Apr 17 16:33:48.487973 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.487900 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cfd58fd7f-ztnxd"] Apr 17 16:33:48.490478 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:33:48.490453 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef63f7b9_fc9e_4558_85c1_fd232a8cc5c0.slice/crio-cf07a4711015faa45493961c0c732571c01d7dea0639d41f0221cc81f2004376 WatchSource:0}: Error finding container cf07a4711015faa45493961c0c732571c01d7dea0639d41f0221cc81f2004376: Status 404 returned error can't find the container with id cf07a4711015faa45493961c0c732571c01d7dea0639d41f0221cc81f2004376 Apr 17 16:33:48.798159 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.798123 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-r6l2s" event={"ID":"a65f5aab-b1e8-48b9-80dc-4d625056d509","Type":"ContainerStarted","Data":"fbeb96abbf6216babe7d0327d495aa674a8263100dfec54e6b986bf4b52bea09"} Apr 17 16:33:48.798159 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.798163 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-r6l2s" event={"ID":"a65f5aab-b1e8-48b9-80dc-4d625056d509","Type":"ContainerStarted","Data":"75589531893a69f689ac5b563aab9d90534e4eb5ceb833171f9441c3ed4c7c8e"} Apr 17 16:33:48.799405 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.799381 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" event={"ID":"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0","Type":"ContainerStarted","Data":"dee09ddcb32cfb9a1040b11288469281cf9898f7ce99bbf3c42cabb3e34a18aa"} Apr 17 16:33:48.799405 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.799410 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" event={"ID":"ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0","Type":"ContainerStarted","Data":"cf07a4711015faa45493961c0c732571c01d7dea0639d41f0221cc81f2004376"} Apr 17 16:33:48.799535 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.799518 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:33:48.834142 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:48.834085 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" podStartSLOduration=1.8340663529999999 podStartE2EDuration="1.834066353s" podCreationTimestamp="2026-04-17 16:33:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:33:48.832895408 +0000 UTC m=+150.056223265" watchObservedRunningTime="2026-04-17 16:33:48.834066353 +0000 UTC m=+150.057394130" Apr 17 16:33:49.803719 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:49.803675 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-r6l2s" event={"ID":"a65f5aab-b1e8-48b9-80dc-4d625056d509","Type":"ContainerStarted","Data":"2be81849e084994cf11ccdb2c8093ee01ca17689a0877cab8d15061f0d0ac311"} Apr 17 16:33:50.808518 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:50.808487 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-r6l2s" event={"ID":"a65f5aab-b1e8-48b9-80dc-4d625056d509","Type":"ContainerStarted","Data":"13a9531ad1c93c2cf982f1c7cde0a5c9791d05870f75aeabe7e6fefa8e30d691"} Apr 17 16:33:50.829556 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:50.829506 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-r6l2s" podStartSLOduration=1.626532724 podStartE2EDuration="3.829494157s" podCreationTimestamp="2026-04-17 16:33:47 +0000 UTC" firstStartedPulling="2026-04-17 16:33:48.441499418 +0000 UTC m=+149.664827173" lastFinishedPulling="2026-04-17 16:33:50.644460849 +0000 UTC m=+151.867788606" observedRunningTime="2026-04-17 16:33:50.82806321 +0000 UTC m=+152.051390988" watchObservedRunningTime="2026-04-17 16:33:50.829494157 +0000 UTC m=+152.052821934" Apr 17 16:33:53.866158 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:53.866123 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8psjj\" (UID: \"857db8bf-55a7-4dbe-a3e9-277f452b9fb9\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj" Apr 17 16:33:53.868416 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:53.868397 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/857db8bf-55a7-4dbe-a3e9-277f452b9fb9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8psjj\" (UID: \"857db8bf-55a7-4dbe-a3e9-277f452b9fb9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj" Apr 17 16:33:54.077476 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:54.077442 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj" Apr 17 16:33:54.205275 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:54.205244 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj"] Apr 17 16:33:54.207874 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:33:54.207840 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod857db8bf_55a7_4dbe_a3e9_277f452b9fb9.slice/crio-e8e9085977c937f9ea23dca135beab8f13824999bb7983001234ec07c7f38572 WatchSource:0}: Error finding container e8e9085977c937f9ea23dca135beab8f13824999bb7983001234ec07c7f38572: Status 404 returned error can't find the container with id e8e9085977c937f9ea23dca135beab8f13824999bb7983001234ec07c7f38572 Apr 17 16:33:54.819131 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:54.819090 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj" event={"ID":"857db8bf-55a7-4dbe-a3e9-277f452b9fb9","Type":"ContainerStarted","Data":"e8e9085977c937f9ea23dca135beab8f13824999bb7983001234ec07c7f38572"} Apr 17 16:33:55.708599 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:55.708548 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-vpvzg" podUID="06db4982-9078-40f4-a267-e270f44de092" Apr 17 16:33:55.715717 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:55.715675 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-jvmkj" podUID="34b1d336-8626-43f1-8ced-2764a72b207b" Apr 17 16:33:55.821188 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:55.821160 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jvmkj" Apr 17 16:33:55.821338 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:55.821172 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vpvzg" Apr 17 16:33:56.824742 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:56.824701 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj" event={"ID":"857db8bf-55a7-4dbe-a3e9-277f452b9fb9","Type":"ContainerStarted","Data":"74d5fdcc6c740777928c18dd9bb0c1bf3199f81ffec0f29dad503c09b01bf7db"} Apr 17 16:33:56.841356 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:56.841315 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8psjj" podStartSLOduration=33.888019642 podStartE2EDuration="35.841301182s" podCreationTimestamp="2026-04-17 16:33:21 +0000 UTC" firstStartedPulling="2026-04-17 16:33:54.209647967 +0000 UTC m=+155.432975737" lastFinishedPulling="2026-04-17 16:33:56.162929509 +0000 UTC m=+157.386257277" observedRunningTime="2026-04-17 16:33:56.840562346 +0000 UTC m=+158.063890125" watchObservedRunningTime="2026-04-17 16:33:56.841301182 +0000 UTC m=+158.064628958" Apr 17 16:33:57.386833 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:33:57.386801 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-42tgv" podUID="b57612e9-f335-4d71-bdba-f06f0735eee1" Apr 17 16:33:57.974062 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:57.974031 2574 patch_prober.go:28] interesting pod/image-registry-56bf8bb974-6ljqr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 16:33:57.974413 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:33:57.974079 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" podUID="3b646c71-de31-4f51-bd49-9a2445937672" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:34:00.620460 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:00.620426 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert\") pod \"ingress-canary-jvmkj\" (UID: \"34b1d336-8626-43f1-8ced-2764a72b207b\") " pod="openshift-ingress-canary/ingress-canary-jvmkj" Apr 17 16:34:00.620860 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:00.620478 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls\") pod \"dns-default-vpvzg\" (UID: \"06db4982-9078-40f4-a267-e270f44de092\") " pod="openshift-dns/dns-default-vpvzg" Apr 17 16:34:00.622744 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:00.622706 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06db4982-9078-40f4-a267-e270f44de092-metrics-tls\") pod \"dns-default-vpvzg\" (UID: \"06db4982-9078-40f4-a267-e270f44de092\") " pod="openshift-dns/dns-default-vpvzg" Apr 17 16:34:00.622856 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:00.622783 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/34b1d336-8626-43f1-8ced-2764a72b207b-cert\") pod \"ingress-canary-jvmkj\" (UID: \"34b1d336-8626-43f1-8ced-2764a72b207b\") " pod="openshift-ingress-canary/ingress-canary-jvmkj" Apr 17 16:34:00.925354 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:00.925277 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dbwrt\"" Apr 17 16:34:00.926136 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:00.926111 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ttskh\"" Apr 17 16:34:00.933058 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:00.933041 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vpvzg" Apr 17 16:34:00.933149 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:00.933122 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jvmkj" Apr 17 16:34:01.054764 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:01.054716 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jvmkj"] Apr 17 16:34:01.058545 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:34:01.058521 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34b1d336_8626_43f1_8ced_2764a72b207b.slice/crio-7c44c54df4a0c845ea7230d902881a29edf1ca29f0a5913acf96c87b0c5de08c WatchSource:0}: Error finding container 7c44c54df4a0c845ea7230d902881a29edf1ca29f0a5913acf96c87b0c5de08c: Status 404 returned error can't find the container with id 7c44c54df4a0c845ea7230d902881a29edf1ca29f0a5913acf96c87b0c5de08c Apr 17 16:34:01.066380 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:01.066355 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vpvzg"] Apr 17 16:34:01.070067 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:34:01.070045 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06db4982_9078_40f4_a267_e270f44de092.slice/crio-72af0f18f65465027f2cae0114b0c4a07bc5044cac927caa2eadad5e3385d998 WatchSource:0}: Error finding container 72af0f18f65465027f2cae0114b0c4a07bc5044cac927caa2eadad5e3385d998: Status 404 returned error can't find the container with id 72af0f18f65465027f2cae0114b0c4a07bc5044cac927caa2eadad5e3385d998 Apr 17 16:34:01.837909 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:01.837867 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jvmkj" event={"ID":"34b1d336-8626-43f1-8ced-2764a72b207b","Type":"ContainerStarted","Data":"7c44c54df4a0c845ea7230d902881a29edf1ca29f0a5913acf96c87b0c5de08c"} Apr 17 16:34:01.839046 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:01.839011 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vpvzg" event={"ID":"06db4982-9078-40f4-a267-e270f44de092","Type":"ContainerStarted","Data":"72af0f18f65465027f2cae0114b0c4a07bc5044cac927caa2eadad5e3385d998"} Apr 17 16:34:03.845139 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:03.845103 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jvmkj" event={"ID":"34b1d336-8626-43f1-8ced-2764a72b207b","Type":"ContainerStarted","Data":"5490ecf74e08233a7b7b6ff414aeda24360599040260cf9b16f689d209874796"} Apr 17 16:34:03.846575 
ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:03.846549 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vpvzg" event={"ID":"06db4982-9078-40f4-a267-e270f44de092","Type":"ContainerStarted","Data":"187dc3ee9de298b2e3b930a64355121d5b645fd29d161264ff1f5a71ac672064"} Apr 17 16:34:03.846575 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:03.846577 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vpvzg" event={"ID":"06db4982-9078-40f4-a267-e270f44de092","Type":"ContainerStarted","Data":"f34d41f74d11a2239dd59e58f20d139aaf76408cf18607ef7c2df5c96520fb46"} Apr 17 16:34:03.846748 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:03.846659 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-vpvzg" Apr 17 16:34:03.861165 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:03.861120 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jvmkj" podStartSLOduration=130.073748986 podStartE2EDuration="2m11.861105688s" podCreationTimestamp="2026-04-17 16:31:52 +0000 UTC" firstStartedPulling="2026-04-17 16:34:01.060341359 +0000 UTC m=+162.283669116" lastFinishedPulling="2026-04-17 16:34:02.847698048 +0000 UTC m=+164.071025818" observedRunningTime="2026-04-17 16:34:03.861002163 +0000 UTC m=+165.084329940" watchObservedRunningTime="2026-04-17 16:34:03.861105688 +0000 UTC m=+165.084433465" Apr 17 16:34:03.878975 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:03.878936 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vpvzg" podStartSLOduration=130.106221427 podStartE2EDuration="2m11.878923891s" podCreationTimestamp="2026-04-17 16:31:52 +0000 UTC" firstStartedPulling="2026-04-17 16:34:01.072223863 +0000 UTC m=+162.295551619" lastFinishedPulling="2026-04-17 16:34:02.844926324 +0000 UTC m=+164.068254083" observedRunningTime="2026-04-17 16:34:03.877686541 +0000 UTC m=+165.101014318" watchObservedRunningTime="2026-04-17 16:34:03.878923891 +0000 UTC m=+165.102251738" Apr 17 16:34:05.114603 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.114567 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hh4b6"] Apr 17 16:34:05.117658 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.117638 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.121270 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.121250 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 16:34:05.121657 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.121638 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 16:34:05.121752 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.121723 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 16:34:05.121823 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.121808 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 16:34:05.121880 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.121822 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-p22nr\"" Apr 17 16:34:05.156064 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.156047 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/256d82eb-b394-49ee-aaa8-6beac670a01e-sys\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.156156 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.156075 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/256d82eb-b394-49ee-aaa8-6beac670a01e-node-exporter-tls\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.156156 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.156103 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/256d82eb-b394-49ee-aaa8-6beac670a01e-metrics-client-ca\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.156156 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.156124 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/256d82eb-b394-49ee-aaa8-6beac670a01e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.156261 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.156170 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/256d82eb-b394-49ee-aaa8-6beac670a01e-node-exporter-textfile\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.156261 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.156201 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/256d82eb-b394-49ee-aaa8-6beac670a01e-node-exporter-wtmp\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.156330 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.156272 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/256d82eb-b394-49ee-aaa8-6beac670a01e-node-exporter-accelerators-collector-config\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.156330 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.156308 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86cxt\" (UniqueName: \"kubernetes.io/projected/256d82eb-b394-49ee-aaa8-6beac670a01e-kube-api-access-86cxt\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.156410 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.156343 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/256d82eb-b394-49ee-aaa8-6beac670a01e-root\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.257678 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.257647 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/256d82eb-b394-49ee-aaa8-6beac670a01e-node-exporter-tls\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.257840 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.257687 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/256d82eb-b394-49ee-aaa8-6beac670a01e-metrics-client-ca\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.257840 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.257707 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/256d82eb-b394-49ee-aaa8-6beac670a01e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.257840 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.257745 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/256d82eb-b394-49ee-aaa8-6beac670a01e-node-exporter-textfile\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.257840 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.257775 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/256d82eb-b394-49ee-aaa8-6beac670a01e-node-exporter-wtmp\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " 
pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.257840 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.257798 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/256d82eb-b394-49ee-aaa8-6beac670a01e-node-exporter-accelerators-collector-config\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.257840 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.257815 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-86cxt\" (UniqueName: \"kubernetes.io/projected/256d82eb-b394-49ee-aaa8-6beac670a01e-kube-api-access-86cxt\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.257840 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.257830 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/256d82eb-b394-49ee-aaa8-6beac670a01e-root\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.257840 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:34:05.257831 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 16:34:05.258192 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.257886 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/256d82eb-b394-49ee-aaa8-6beac670a01e-root\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.258192 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:34:05.257905 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/256d82eb-b394-49ee-aaa8-6beac670a01e-node-exporter-tls podName:256d82eb-b394-49ee-aaa8-6beac670a01e nodeName:}" failed. No retries permitted until 2026-04-17 16:34:05.757884721 +0000 UTC m=+166.981212484 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/256d82eb-b394-49ee-aaa8-6beac670a01e-node-exporter-tls") pod "node-exporter-hh4b6" (UID: "256d82eb-b394-49ee-aaa8-6beac670a01e") : secret "node-exporter-tls" not found Apr 17 16:34:05.258192 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.257958 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/256d82eb-b394-49ee-aaa8-6beac670a01e-node-exporter-wtmp\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.258192 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.258004 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/256d82eb-b394-49ee-aaa8-6beac670a01e-sys\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.258192 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.258073 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/256d82eb-b394-49ee-aaa8-6beac670a01e-sys\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.258192 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.258188 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/256d82eb-b394-49ee-aaa8-6beac670a01e-node-exporter-textfile\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.258388 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.258284 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/256d82eb-b394-49ee-aaa8-6beac670a01e-metrics-client-ca\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.258388 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.258311 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/256d82eb-b394-49ee-aaa8-6beac670a01e-node-exporter-accelerators-collector-config\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.260047 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.260026 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/256d82eb-b394-49ee-aaa8-6beac670a01e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.273527 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.273506 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-86cxt\" (UniqueName: \"kubernetes.io/projected/256d82eb-b394-49ee-aaa8-6beac670a01e-kube-api-access-86cxt\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.762000 
ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.761917 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/256d82eb-b394-49ee-aaa8-6beac670a01e-node-exporter-tls\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:05.764169 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:05.764147 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/256d82eb-b394-49ee-aaa8-6beac670a01e-node-exporter-tls\") pod \"node-exporter-hh4b6\" (UID: \"256d82eb-b394-49ee-aaa8-6beac670a01e\") " pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:06.026336 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:06.026309 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hh4b6" Apr 17 16:34:06.034703 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:34:06.034677 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod256d82eb_b394_49ee_aaa8_6beac670a01e.slice/crio-6f15df51082192a2a6c6b53d305c8719379a8acf56b7046bd8583593cbeea4c8 WatchSource:0}: Error finding container 6f15df51082192a2a6c6b53d305c8719379a8acf56b7046bd8583593cbeea4c8: Status 404 returned error can't find the container with id 6f15df51082192a2a6c6b53d305c8719379a8acf56b7046bd8583593cbeea4c8 Apr 17 16:34:06.855920 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:06.855895 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hh4b6" event={"ID":"256d82eb-b394-49ee-aaa8-6beac670a01e","Type":"ContainerStarted","Data":"44ad71244efc04f68cbd4fa552d79f1c65bd46d45942e74be85fdfffdd5cde0a"} Apr 17 16:34:06.856264 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:06.855935 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hh4b6" event={"ID":"256d82eb-b394-49ee-aaa8-6beac670a01e","Type":"ContainerStarted","Data":"6f15df51082192a2a6c6b53d305c8719379a8acf56b7046bd8583593cbeea4c8"} Apr 17 16:34:07.859230 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:07.859201 2574 generic.go:358] "Generic (PLEG): container finished" podID="256d82eb-b394-49ee-aaa8-6beac670a01e" containerID="44ad71244efc04f68cbd4fa552d79f1c65bd46d45942e74be85fdfffdd5cde0a" exitCode=0 Apr 17 16:34:07.859576 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:07.859252 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hh4b6" event={"ID":"256d82eb-b394-49ee-aaa8-6beac670a01e","Type":"ContainerDied","Data":"44ad71244efc04f68cbd4fa552d79f1c65bd46d45942e74be85fdfffdd5cde0a"} Apr 17 16:34:07.974633 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:07.974608 2574 patch_prober.go:28] interesting pod/image-registry-56bf8bb974-6ljqr container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 16:34:07.974744 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:07.974652 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" podUID="3b646c71-de31-4f51-bd49-9a2445937672" containerName="registry" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 17 16:34:08.346665 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:08.346632 2574 patch_prober.go:28] interesting pod/image-registry-5cfd58fd7f-ztnxd container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 16:34:08.346842 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:08.346686 2574 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" podUID="ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 16:34:08.863479 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:08.863442 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hh4b6" event={"ID":"256d82eb-b394-49ee-aaa8-6beac670a01e","Type":"ContainerStarted","Data":"fec2c39286c96cf32d9607f7afe2eb5a13d42a7a8c35942edb4fd9cade334264"} Apr 17 16:34:08.863479 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:08.863476 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hh4b6" event={"ID":"256d82eb-b394-49ee-aaa8-6beac670a01e","Type":"ContainerStarted","Data":"75b434bd150ea470635143362801349cf19a8e88c9c193b3cce2bd6632350be6"} Apr 17 16:34:08.900791 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:08.900748 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-hh4b6" podStartSLOduration=3.161187429 podStartE2EDuration="3.900716432s" podCreationTimestamp="2026-04-17 16:34:05 +0000 UTC" firstStartedPulling="2026-04-17 16:34:06.036354595 +0000 UTC m=+167.259682358" lastFinishedPulling="2026-04-17 16:34:06.775883606 +0000 UTC m=+167.999211361" observedRunningTime="2026-04-17 16:34:08.899887867 +0000 UTC m=+170.123215648" watchObservedRunningTime="2026-04-17 16:34:08.900716432 +0000 UTC m=+170.124044209" Apr 17 16:34:09.808005 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:09.807980 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5cfd58fd7f-ztnxd" Apr 17 16:34:09.874853 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:09.874822 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-gq92p"] Apr 17 16:34:09.877671 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:09.877653 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gq92p" Apr 17 16:34:09.880435 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:09.880413 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 16:34:09.880618 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:09.880600 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-9zmx2\"" Apr 17 16:34:09.889303 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:09.889282 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-gq92p"] Apr 17 16:34:09.996395 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:09.996362 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5820ffd9-b681-4007-8950-dab81ba98039-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-gq92p\" (UID: \"5820ffd9-b681-4007-8950-dab81ba98039\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gq92p" Apr 17 16:34:10.097686 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:10.097615 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5820ffd9-b681-4007-8950-dab81ba98039-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-gq92p\" (UID: \"5820ffd9-b681-4007-8950-dab81ba98039\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gq92p" Apr 17 16:34:10.100052 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:10.100027 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5820ffd9-b681-4007-8950-dab81ba98039-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-gq92p\" (UID: \"5820ffd9-b681-4007-8950-dab81ba98039\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gq92p" Apr 17 16:34:10.186022 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:10.185984 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gq92p" Apr 17 16:34:10.300366 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:10.300304 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-gq92p"] Apr 17 16:34:10.302335 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:34:10.302306 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5820ffd9_b681_4007_8950_dab81ba98039.slice/crio-a96ebb0bc00042b5da3867c834c9ef31b26735cf88a33476d1254e415df722b2 WatchSource:0}: Error finding container a96ebb0bc00042b5da3867c834c9ef31b26735cf88a33476d1254e415df722b2: Status 404 returned error can't find the container with id a96ebb0bc00042b5da3867c834c9ef31b26735cf88a33476d1254e415df722b2 Apr 17 16:34:10.870609 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:10.870574 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gq92p" event={"ID":"5820ffd9-b681-4007-8950-dab81ba98039","Type":"ContainerStarted","Data":"a96ebb0bc00042b5da3867c834c9ef31b26735cf88a33476d1254e415df722b2"} Apr 17 16:34:11.377437 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:11.377396 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:34:11.876375 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:11.876342 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gq92p" event={"ID":"5820ffd9-b681-4007-8950-dab81ba98039","Type":"ContainerStarted","Data":"5c9efe1fa7f4129d02350887b9f50c6555b000b80a862220983ebaaebf1ad6fc"} Apr 17 16:34:11.876589 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:11.876569 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gq92p" Apr 17 16:34:11.881113 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:11.881052 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gq92p" Apr 17 16:34:11.894381 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:11.894340 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-gq92p" podStartSLOduration=1.605408187 podStartE2EDuration="2.894328685s" podCreationTimestamp="2026-04-17 16:34:09 +0000 UTC" firstStartedPulling="2026-04-17 16:34:10.304178875 +0000 UTC m=+171.527506636" lastFinishedPulling="2026-04-17 16:34:11.593099375 +0000 UTC m=+172.816427134" observedRunningTime="2026-04-17 16:34:11.894300218 +0000 UTC m=+173.117627996" watchObservedRunningTime="2026-04-17 16:34:11.894328685 +0000 UTC m=+173.117656470" Apr 17 16:34:12.988811 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:12.988772 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" podUID="3b646c71-de31-4f51-bd49-9a2445937672" containerName="registry" containerID="cri-o://aa2d3a07f119b6f5034163bb2873a2f236a3c17561e6dcf482e3d9b767b31710" gracePeriod=30 Apr 17 16:34:13.219322 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.219296 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:34:13.325008 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.324976 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3b646c71-de31-4f51-bd49-9a2445937672-image-registry-private-configuration\") pod \"3b646c71-de31-4f51-bd49-9a2445937672\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " Apr 17 16:34:13.325176 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.325025 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b646c71-de31-4f51-bd49-9a2445937672-ca-trust-extracted\") pod \"3b646c71-de31-4f51-bd49-9a2445937672\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " Apr 17 16:34:13.325176 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.325052 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b646c71-de31-4f51-bd49-9a2445937672-trusted-ca\") pod \"3b646c71-de31-4f51-bd49-9a2445937672\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " Apr 17 16:34:13.325176 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.325162 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b646c71-de31-4f51-bd49-9a2445937672-registry-certificates\") pod \"3b646c71-de31-4f51-bd49-9a2445937672\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " Apr 17 16:34:13.325349 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.325221 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b646c71-de31-4f51-bd49-9a2445937672-installation-pull-secrets\") pod \"3b646c71-de31-4f51-bd49-9a2445937672\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " Apr 17 16:34:13.325349 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.325254 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-bound-sa-token\") pod \"3b646c71-de31-4f51-bd49-9a2445937672\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " Apr 17 16:34:13.325349 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.325288 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-registry-tls\") pod \"3b646c71-de31-4f51-bd49-9a2445937672\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " Apr 17 16:34:13.325349 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.325317 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8dnv\" (UniqueName: \"kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-kube-api-access-l8dnv\") pod \"3b646c71-de31-4f51-bd49-9a2445937672\" (UID: \"3b646c71-de31-4f51-bd49-9a2445937672\") " Apr 17 16:34:13.325548 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.325468 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b646c71-de31-4f51-bd49-9a2445937672-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "3b646c71-de31-4f51-bd49-9a2445937672" (UID: "3b646c71-de31-4f51-bd49-9a2445937672"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:13.325608 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.325552 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b646c71-de31-4f51-bd49-9a2445937672-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "3b646c71-de31-4f51-bd49-9a2445937672" (UID: "3b646c71-de31-4f51-bd49-9a2445937672"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:34:13.325688 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.325662 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b646c71-de31-4f51-bd49-9a2445937672-trusted-ca\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:34:13.325781 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.325694 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3b646c71-de31-4f51-bd49-9a2445937672-registry-certificates\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:34:13.327615 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.327572 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "3b646c71-de31-4f51-bd49-9a2445937672" (UID: "3b646c71-de31-4f51-bd49-9a2445937672"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:34:13.327704 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.327609 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "3b646c71-de31-4f51-bd49-9a2445937672" (UID: "3b646c71-de31-4f51-bd49-9a2445937672"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:34:13.327704 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.327672 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b646c71-de31-4f51-bd49-9a2445937672-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "3b646c71-de31-4f51-bd49-9a2445937672" (UID: "3b646c71-de31-4f51-bd49-9a2445937672"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:13.327900 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.327871 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-kube-api-access-l8dnv" (OuterVolumeSpecName: "kube-api-access-l8dnv") pod "3b646c71-de31-4f51-bd49-9a2445937672" (UID: "3b646c71-de31-4f51-bd49-9a2445937672"). InnerVolumeSpecName "kube-api-access-l8dnv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:34:13.328005 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.327955 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b646c71-de31-4f51-bd49-9a2445937672-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "3b646c71-de31-4f51-bd49-9a2445937672" (UID: "3b646c71-de31-4f51-bd49-9a2445937672"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:34:13.335260 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.335240 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b646c71-de31-4f51-bd49-9a2445937672-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "3b646c71-de31-4f51-bd49-9a2445937672" (UID: "3b646c71-de31-4f51-bd49-9a2445937672"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:34:13.426872 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.426849 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3b646c71-de31-4f51-bd49-9a2445937672-image-registry-private-configuration\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:34:13.426872 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.426872 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3b646c71-de31-4f51-bd49-9a2445937672-ca-trust-extracted\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:34:13.427019 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.426887 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3b646c71-de31-4f51-bd49-9a2445937672-installation-pull-secrets\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:34:13.427019 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.426899 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-bound-sa-token\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:34:13.427019 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.426912 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-registry-tls\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:34:13.427019 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.426923 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l8dnv\" (UniqueName: \"kubernetes.io/projected/3b646c71-de31-4f51-bd49-9a2445937672-kube-api-access-l8dnv\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:34:13.851532 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.851502 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vpvzg" Apr 17 16:34:13.883446 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.883421 2574 generic.go:358] "Generic (PLEG): container finished" podID="3b646c71-de31-4f51-bd49-9a2445937672" containerID="aa2d3a07f119b6f5034163bb2873a2f236a3c17561e6dcf482e3d9b767b31710" exitCode=0 Apr 17 16:34:13.883608 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.883482 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" Apr 17 16:34:13.883608 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.883507 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" event={"ID":"3b646c71-de31-4f51-bd49-9a2445937672","Type":"ContainerDied","Data":"aa2d3a07f119b6f5034163bb2873a2f236a3c17561e6dcf482e3d9b767b31710"} Apr 17 16:34:13.883608 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.883547 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56bf8bb974-6ljqr" event={"ID":"3b646c71-de31-4f51-bd49-9a2445937672","Type":"ContainerDied","Data":"9d9cd9831e5219fdd4938fade139bbc6fbd6037f6f9449012f2243ef372a1fc7"} Apr 17 16:34:13.883608 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.883567 2574 scope.go:117] "RemoveContainer" containerID="aa2d3a07f119b6f5034163bb2873a2f236a3c17561e6dcf482e3d9b767b31710" Apr 17 16:34:13.892833 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.892814 2574 scope.go:117] "RemoveContainer" containerID="aa2d3a07f119b6f5034163bb2873a2f236a3c17561e6dcf482e3d9b767b31710" Apr 17 16:34:13.893113 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:34:13.893092 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa2d3a07f119b6f5034163bb2873a2f236a3c17561e6dcf482e3d9b767b31710\": container with ID starting with aa2d3a07f119b6f5034163bb2873a2f236a3c17561e6dcf482e3d9b767b31710 not found: ID does not exist" containerID="aa2d3a07f119b6f5034163bb2873a2f236a3c17561e6dcf482e3d9b767b31710" Apr 17 16:34:13.893178 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.893121 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa2d3a07f119b6f5034163bb2873a2f236a3c17561e6dcf482e3d9b767b31710"} err="failed to get container status \"aa2d3a07f119b6f5034163bb2873a2f236a3c17561e6dcf482e3d9b767b31710\": rpc error: code = NotFound desc = could not find container \"aa2d3a07f119b6f5034163bb2873a2f236a3c17561e6dcf482e3d9b767b31710\": container with ID starting with aa2d3a07f119b6f5034163bb2873a2f236a3c17561e6dcf482e3d9b767b31710 not found: ID does not exist" Apr 17 16:34:13.899032 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.899011 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-56bf8bb974-6ljqr"] Apr 17 16:34:13.903912 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:13.903893 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-56bf8bb974-6ljqr"] Apr 17 16:34:15.381129 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:15.381093 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b646c71-de31-4f51-bd49-9a2445937672" path="/var/lib/kubelet/pods/3b646c71-de31-4f51-bd49-9a2445937672/volumes" Apr 17 16:34:17.956209 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:17.956165 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-tbr2k"] Apr 17 16:34:17.956556 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:17.956412 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b646c71-de31-4f51-bd49-9a2445937672" containerName="registry" Apr 17 16:34:17.956556 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:17.956423 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b646c71-de31-4f51-bd49-9a2445937672" 
containerName="registry" Apr 17 16:34:17.956556 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:17.956470 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3b646c71-de31-4f51-bd49-9a2445937672" containerName="registry" Apr 17 16:34:17.961227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:17.961210 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-tbr2k" Apr 17 16:34:17.963770 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:17.963726 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 16:34:17.963879 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:17.963819 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 16:34:17.963879 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:17.963863 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-sm5qj\"" Apr 17 16:34:17.969340 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:17.969318 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-tbr2k"] Apr 17 16:34:18.062268 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:18.062226 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t52r\" (UniqueName: \"kubernetes.io/projected/21afef29-d7fa-4797-9a01-18075be87fb6-kube-api-access-4t52r\") pod \"downloads-6bcc868b7-tbr2k\" (UID: \"21afef29-d7fa-4797-9a01-18075be87fb6\") " pod="openshift-console/downloads-6bcc868b7-tbr2k" Apr 17 16:34:18.163239 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:18.163200 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4t52r\" (UniqueName: \"kubernetes.io/projected/21afef29-d7fa-4797-9a01-18075be87fb6-kube-api-access-4t52r\") pod \"downloads-6bcc868b7-tbr2k\" (UID: \"21afef29-d7fa-4797-9a01-18075be87fb6\") " pod="openshift-console/downloads-6bcc868b7-tbr2k" Apr 17 16:34:18.171529 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:18.171494 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t52r\" (UniqueName: \"kubernetes.io/projected/21afef29-d7fa-4797-9a01-18075be87fb6-kube-api-access-4t52r\") pod \"downloads-6bcc868b7-tbr2k\" (UID: \"21afef29-d7fa-4797-9a01-18075be87fb6\") " pod="openshift-console/downloads-6bcc868b7-tbr2k" Apr 17 16:34:18.270231 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:18.270198 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-tbr2k" Apr 17 16:34:18.384481 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:18.384450 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-tbr2k"] Apr 17 16:34:18.387469 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:34:18.387443 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21afef29_d7fa_4797_9a01_18075be87fb6.slice/crio-2a64c42c50966e986dfa325df21930652744fc1442a0202d8c023df334ab73a5 WatchSource:0}: Error finding container 2a64c42c50966e986dfa325df21930652744fc1442a0202d8c023df334ab73a5: Status 404 returned error can't find the container with id 2a64c42c50966e986dfa325df21930652744fc1442a0202d8c023df334ab73a5 Apr 17 16:34:18.898085 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:18.898044 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-tbr2k" event={"ID":"21afef29-d7fa-4797-9a01-18075be87fb6","Type":"ContainerStarted","Data":"2a64c42c50966e986dfa325df21930652744fc1442a0202d8c023df334ab73a5"} Apr 17 16:34:27.501824 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.501786 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-648fd7dbf4-7mrrh"] Apr 17 16:34:27.507615 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.507591 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:34:27.510695 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.510672 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 16:34:27.511752 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.511607 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 16:34:27.511752 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.511632 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 16:34:27.511752 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.511711 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 16:34:27.511752 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.511712 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 16:34:27.511997 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.511725 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-fl6zn\"" Apr 17 16:34:27.519677 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.519657 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-648fd7dbf4-7mrrh"] Apr 17 16:34:27.646633 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.646591 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6305c116-070b-4d07-80fe-d20c9ddedf6f-console-oauth-config\") pod \"console-648fd7dbf4-7mrrh\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:34:27.646823 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.646645 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6305c116-070b-4d07-80fe-d20c9ddedf6f-console-serving-cert\") pod \"console-648fd7dbf4-7mrrh\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:34:27.646823 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.646767 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6305c116-070b-4d07-80fe-d20c9ddedf6f-console-config\") pod \"console-648fd7dbf4-7mrrh\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:34:27.646934 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.646852 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6305c116-070b-4d07-80fe-d20c9ddedf6f-oauth-serving-cert\") pod \"console-648fd7dbf4-7mrrh\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:34:27.646934 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.646908 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rz5c\" (UniqueName: \"kubernetes.io/projected/6305c116-070b-4d07-80fe-d20c9ddedf6f-kube-api-access-6rz5c\") pod \"console-648fd7dbf4-7mrrh\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:34:27.647026 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.646934 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6305c116-070b-4d07-80fe-d20c9ddedf6f-service-ca\") pod \"console-648fd7dbf4-7mrrh\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:34:27.747811 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.747781 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6305c116-070b-4d07-80fe-d20c9ddedf6f-console-serving-cert\") pod \"console-648fd7dbf4-7mrrh\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:34:27.747975 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.747849 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6305c116-070b-4d07-80fe-d20c9ddedf6f-console-config\") pod \"console-648fd7dbf4-7mrrh\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:34:27.747975 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.747877 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6305c116-070b-4d07-80fe-d20c9ddedf6f-oauth-serving-cert\") pod \"console-648fd7dbf4-7mrrh\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:34:27.747975 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.747912 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rz5c\" (UniqueName: 
\"kubernetes.io/projected/6305c116-070b-4d07-80fe-d20c9ddedf6f-kube-api-access-6rz5c\") pod \"console-648fd7dbf4-7mrrh\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:34:27.748120 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.748044 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6305c116-070b-4d07-80fe-d20c9ddedf6f-service-ca\") pod \"console-648fd7dbf4-7mrrh\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:34:27.748120 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.748113 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6305c116-070b-4d07-80fe-d20c9ddedf6f-console-oauth-config\") pod \"console-648fd7dbf4-7mrrh\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:34:27.748711 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.748685 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6305c116-070b-4d07-80fe-d20c9ddedf6f-console-config\") pod \"console-648fd7dbf4-7mrrh\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:34:27.748955 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.748778 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6305c116-070b-4d07-80fe-d20c9ddedf6f-service-ca\") pod \"console-648fd7dbf4-7mrrh\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:34:27.748955 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.748844 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6305c116-070b-4d07-80fe-d20c9ddedf6f-oauth-serving-cert\") pod \"console-648fd7dbf4-7mrrh\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:34:27.750974 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.750949 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6305c116-070b-4d07-80fe-d20c9ddedf6f-console-oauth-config\") pod \"console-648fd7dbf4-7mrrh\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:34:27.751085 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.751046 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6305c116-070b-4d07-80fe-d20c9ddedf6f-console-serving-cert\") pod \"console-648fd7dbf4-7mrrh\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:34:27.772135 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.772113 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rz5c\" (UniqueName: \"kubernetes.io/projected/6305c116-070b-4d07-80fe-d20c9ddedf6f-kube-api-access-6rz5c\") pod \"console-648fd7dbf4-7mrrh\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:34:27.817987 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:27.817897 
2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:34:33.688998 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:33.688971 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-648fd7dbf4-7mrrh"] Apr 17 16:34:33.690881 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:34:33.690856 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6305c116_070b_4d07_80fe_d20c9ddedf6f.slice/crio-973e7227b183264119fb6c8d2f69d20c19f051a716ec7e9c7fb4e5913313a4d6 WatchSource:0}: Error finding container 973e7227b183264119fb6c8d2f69d20c19f051a716ec7e9c7fb4e5913313a4d6: Status 404 returned error can't find the container with id 973e7227b183264119fb6c8d2f69d20c19f051a716ec7e9c7fb4e5913313a4d6 Apr 17 16:34:33.942512 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:33.942421 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-648fd7dbf4-7mrrh" event={"ID":"6305c116-070b-4d07-80fe-d20c9ddedf6f","Type":"ContainerStarted","Data":"973e7227b183264119fb6c8d2f69d20c19f051a716ec7e9c7fb4e5913313a4d6"} Apr 17 16:34:33.944044 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:33.944015 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-tbr2k" event={"ID":"21afef29-d7fa-4797-9a01-18075be87fb6","Type":"ContainerStarted","Data":"37c78ccfcebe063a112b9471534d34ba8dfef1534ed8da9eeea119d9c0e9dd45"} Apr 17 16:34:33.944261 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:33.944244 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-tbr2k" Apr 17 16:34:33.960771 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:33.960722 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-tbr2k" Apr 17 16:34:33.971238 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:33.971188 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-tbr2k" podStartSLOduration=1.722013622 podStartE2EDuration="16.971173199s" podCreationTimestamp="2026-04-17 16:34:17 +0000 UTC" firstStartedPulling="2026-04-17 16:34:18.389542205 +0000 UTC m=+179.612869971" lastFinishedPulling="2026-04-17 16:34:33.638701788 +0000 UTC m=+194.862029548" observedRunningTime="2026-04-17 16:34:33.969637069 +0000 UTC m=+195.192964849" watchObservedRunningTime="2026-04-17 16:34:33.971173199 +0000 UTC m=+195.194500976" Apr 17 16:34:37.959320 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:37.959279 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-648fd7dbf4-7mrrh" event={"ID":"6305c116-070b-4d07-80fe-d20c9ddedf6f","Type":"ContainerStarted","Data":"296377b2743c69338e29155e7ad94ac83d702ec30546c215cd667db01f0f34fb"} Apr 17 16:34:37.979853 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:37.979797 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-648fd7dbf4-7mrrh" podStartSLOduration=7.371247746 podStartE2EDuration="10.979774183s" podCreationTimestamp="2026-04-17 16:34:27 +0000 UTC" firstStartedPulling="2026-04-17 16:34:33.692856243 +0000 UTC m=+194.916184011" lastFinishedPulling="2026-04-17 16:34:37.301382691 +0000 UTC m=+198.524710448" observedRunningTime="2026-04-17 16:34:37.977993151 +0000 UTC m=+199.201320932" watchObservedRunningTime="2026-04-17 
16:34:37.979774183 +0000 UTC m=+199.203101963" Apr 17 16:34:46.846227 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:46.846190 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-648fd7dbf4-7mrrh"] Apr 17 16:34:47.818268 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:47.818230 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:34:56.005306 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:56.005281 2574 generic.go:358] "Generic (PLEG): container finished" podID="9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf" containerID="797588b2a331b80d2c503a229225811f89aad7a5e2e66c40cd1de953d86c38e8" exitCode=0 Apr 17 16:34:56.005621 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:56.005322 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wkqhn" event={"ID":"9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf","Type":"ContainerDied","Data":"797588b2a331b80d2c503a229225811f89aad7a5e2e66c40cd1de953d86c38e8"} Apr 17 16:34:56.005621 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:56.005617 2574 scope.go:117] "RemoveContainer" containerID="797588b2a331b80d2c503a229225811f89aad7a5e2e66c40cd1de953d86c38e8" Apr 17 16:34:57.009066 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:57.009030 2574 generic.go:358] "Generic (PLEG): container finished" podID="557e1be2-405d-4d48-bff8-fc7fef95cc7c" containerID="b573fedbc1e94ea946abd16d7d6a12259c6cdd365895452ff1d5b88e68b7c08d" exitCode=0 Apr 17 16:34:57.009478 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:57.009107 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kzsgh" event={"ID":"557e1be2-405d-4d48-bff8-fc7fef95cc7c","Type":"ContainerDied","Data":"b573fedbc1e94ea946abd16d7d6a12259c6cdd365895452ff1d5b88e68b7c08d"} Apr 17 16:34:57.009532 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:57.009479 2574 scope.go:117] "RemoveContainer" containerID="b573fedbc1e94ea946abd16d7d6a12259c6cdd365895452ff1d5b88e68b7c08d" Apr 17 16:34:57.010869 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:57.010754 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-wkqhn" event={"ID":"9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf","Type":"ContainerStarted","Data":"c608059f9c0ea2029060fe3c9777935df2f8bce692580e503cd429b9f2d2b2b1"} Apr 17 16:34:58.014402 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:34:58.014370 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-kzsgh" event={"ID":"557e1be2-405d-4d48-bff8-fc7fef95cc7c","Type":"ContainerStarted","Data":"b3211bc65d5a8f3e1fadbac6fb52277ac49a8db16f22149256a62d08c9ef0f9a"} Apr 17 16:35:11.864412 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:11.864350 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-648fd7dbf4-7mrrh" podUID="6305c116-070b-4d07-80fe-d20c9ddedf6f" containerName="console" containerID="cri-o://296377b2743c69338e29155e7ad94ac83d702ec30546c215cd667db01f0f34fb" gracePeriod=15 Apr 17 16:35:12.052521 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.052500 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-648fd7dbf4-7mrrh_6305c116-070b-4d07-80fe-d20c9ddedf6f/console/0.log" Apr 17 16:35:12.052649 
ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.052536 2574 generic.go:358] "Generic (PLEG): container finished" podID="6305c116-070b-4d07-80fe-d20c9ddedf6f" containerID="296377b2743c69338e29155e7ad94ac83d702ec30546c215cd667db01f0f34fb" exitCode=2 Apr 17 16:35:12.052649 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.052622 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-648fd7dbf4-7mrrh" event={"ID":"6305c116-070b-4d07-80fe-d20c9ddedf6f","Type":"ContainerDied","Data":"296377b2743c69338e29155e7ad94ac83d702ec30546c215cd667db01f0f34fb"} Apr 17 16:35:12.125976 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.125957 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-648fd7dbf4-7mrrh_6305c116-070b-4d07-80fe-d20c9ddedf6f/console/0.log" Apr 17 16:35:12.126086 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.126016 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:35:12.204434 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.204405 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6305c116-070b-4d07-80fe-d20c9ddedf6f-console-config\") pod \"6305c116-070b-4d07-80fe-d20c9ddedf6f\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " Apr 17 16:35:12.204576 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.204442 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6305c116-070b-4d07-80fe-d20c9ddedf6f-service-ca\") pod \"6305c116-070b-4d07-80fe-d20c9ddedf6f\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " Apr 17 16:35:12.204576 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.204465 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6305c116-070b-4d07-80fe-d20c9ddedf6f-oauth-serving-cert\") pod \"6305c116-070b-4d07-80fe-d20c9ddedf6f\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " Apr 17 16:35:12.204576 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.204487 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rz5c\" (UniqueName: \"kubernetes.io/projected/6305c116-070b-4d07-80fe-d20c9ddedf6f-kube-api-access-6rz5c\") pod \"6305c116-070b-4d07-80fe-d20c9ddedf6f\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " Apr 17 16:35:12.204712 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.204610 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6305c116-070b-4d07-80fe-d20c9ddedf6f-console-oauth-config\") pod \"6305c116-070b-4d07-80fe-d20c9ddedf6f\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " Apr 17 16:35:12.204712 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.204652 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6305c116-070b-4d07-80fe-d20c9ddedf6f-console-serving-cert\") pod \"6305c116-070b-4d07-80fe-d20c9ddedf6f\" (UID: \"6305c116-070b-4d07-80fe-d20c9ddedf6f\") " Apr 17 16:35:12.204919 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.204890 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6305c116-070b-4d07-80fe-d20c9ddedf6f-service-ca" 
(OuterVolumeSpecName: "service-ca") pod "6305c116-070b-4d07-80fe-d20c9ddedf6f" (UID: "6305c116-070b-4d07-80fe-d20c9ddedf6f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:35:12.204919 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.204907 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6305c116-070b-4d07-80fe-d20c9ddedf6f-console-config" (OuterVolumeSpecName: "console-config") pod "6305c116-070b-4d07-80fe-d20c9ddedf6f" (UID: "6305c116-070b-4d07-80fe-d20c9ddedf6f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:35:12.205034 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.204894 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6305c116-070b-4d07-80fe-d20c9ddedf6f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6305c116-070b-4d07-80fe-d20c9ddedf6f" (UID: "6305c116-070b-4d07-80fe-d20c9ddedf6f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:35:12.206864 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.206837 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6305c116-070b-4d07-80fe-d20c9ddedf6f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6305c116-070b-4d07-80fe-d20c9ddedf6f" (UID: "6305c116-070b-4d07-80fe-d20c9ddedf6f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:12.207016 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.206997 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6305c116-070b-4d07-80fe-d20c9ddedf6f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6305c116-070b-4d07-80fe-d20c9ddedf6f" (UID: "6305c116-070b-4d07-80fe-d20c9ddedf6f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:35:12.207267 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.207244 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6305c116-070b-4d07-80fe-d20c9ddedf6f-kube-api-access-6rz5c" (OuterVolumeSpecName: "kube-api-access-6rz5c") pod "6305c116-070b-4d07-80fe-d20c9ddedf6f" (UID: "6305c116-070b-4d07-80fe-d20c9ddedf6f"). InnerVolumeSpecName "kube-api-access-6rz5c". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:35:12.306018 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.305987 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6305c116-070b-4d07-80fe-d20c9ddedf6f-console-config\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:35:12.306018 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.306012 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6305c116-070b-4d07-80fe-d20c9ddedf6f-service-ca\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:35:12.306018 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.306022 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6305c116-070b-4d07-80fe-d20c9ddedf6f-oauth-serving-cert\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:35:12.306208 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.306032 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6rz5c\" (UniqueName: \"kubernetes.io/projected/6305c116-070b-4d07-80fe-d20c9ddedf6f-kube-api-access-6rz5c\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:35:12.306208 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.306042 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6305c116-070b-4d07-80fe-d20c9ddedf6f-console-oauth-config\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:35:12.306208 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:12.306051 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6305c116-070b-4d07-80fe-d20c9ddedf6f-console-serving-cert\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:35:13.056167 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:13.056140 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-648fd7dbf4-7mrrh_6305c116-070b-4d07-80fe-d20c9ddedf6f/console/0.log" Apr 17 16:35:13.056563 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:13.056234 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-648fd7dbf4-7mrrh" event={"ID":"6305c116-070b-4d07-80fe-d20c9ddedf6f","Type":"ContainerDied","Data":"973e7227b183264119fb6c8d2f69d20c19f051a716ec7e9c7fb4e5913313a4d6"} Apr 17 16:35:13.056563 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:13.056266 2574 scope.go:117] "RemoveContainer" containerID="296377b2743c69338e29155e7ad94ac83d702ec30546c215cd667db01f0f34fb" Apr 17 16:35:13.056563 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:13.056266 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-648fd7dbf4-7mrrh" Apr 17 16:35:13.076397 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:13.076374 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-648fd7dbf4-7mrrh"] Apr 17 16:35:13.083240 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:13.083219 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-648fd7dbf4-7mrrh"] Apr 17 16:35:13.381567 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:13.381496 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6305c116-070b-4d07-80fe-d20c9ddedf6f" path="/var/lib/kubelet/pods/6305c116-070b-4d07-80fe-d20c9ddedf6f/volumes" Apr 17 16:35:29.439849 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.439813 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2"] Apr 17 16:35:29.440342 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.440159 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6305c116-070b-4d07-80fe-d20c9ddedf6f" containerName="console" Apr 17 16:35:29.440342 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.440176 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6305c116-070b-4d07-80fe-d20c9ddedf6f" containerName="console" Apr 17 16:35:29.440342 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.440240 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="6305c116-070b-4d07-80fe-d20c9ddedf6f" containerName="console" Apr 17 16:35:29.445720 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.445698 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.448484 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.448464 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-czc6g\"" Apr 17 16:35:29.448608 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.448464 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 17 16:35:29.448807 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.448768 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 17 16:35:29.448957 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.448941 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 17 16:35:29.449037 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.448972 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 17 16:35:29.449453 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.449440 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 17 16:35:29.454253 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.454228 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 17 16:35:29.454932 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.454914 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2"] Apr 17 16:35:29.544350 
ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.544315 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/44386569-af8e-4972-bf54-4134fe78214e-federate-client-tls\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.544350 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.544356 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/44386569-af8e-4972-bf54-4134fe78214e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.544559 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.544381 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44386569-af8e-4972-bf54-4134fe78214e-serving-certs-ca-bundle\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.544559 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.544401 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/44386569-af8e-4972-bf54-4134fe78214e-telemeter-client-tls\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.544559 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.544454 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44386569-af8e-4972-bf54-4134fe78214e-telemeter-trusted-ca-bundle\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.544559 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.544484 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zs5d\" (UniqueName: \"kubernetes.io/projected/44386569-af8e-4972-bf54-4134fe78214e-kube-api-access-6zs5d\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.544559 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.544532 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/44386569-af8e-4972-bf54-4134fe78214e-secret-telemeter-client\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.544559 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.544556 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/44386569-af8e-4972-bf54-4134fe78214e-metrics-client-ca\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.645108 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.645068 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/44386569-af8e-4972-bf54-4134fe78214e-secret-telemeter-client\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.645108 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.645105 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/44386569-af8e-4972-bf54-4134fe78214e-metrics-client-ca\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.645342 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.645261 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/44386569-af8e-4972-bf54-4134fe78214e-federate-client-tls\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.645342 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.645311 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/44386569-af8e-4972-bf54-4134fe78214e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.645445 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.645355 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44386569-af8e-4972-bf54-4134fe78214e-serving-certs-ca-bundle\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.645445 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.645383 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/44386569-af8e-4972-bf54-4134fe78214e-telemeter-client-tls\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.645587 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.645546 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44386569-af8e-4972-bf54-4134fe78214e-telemeter-trusted-ca-bundle\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.645713 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.645612 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6zs5d\" (UniqueName: \"kubernetes.io/projected/44386569-af8e-4972-bf54-4134fe78214e-kube-api-access-6zs5d\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.645872 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.645847 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/44386569-af8e-4972-bf54-4134fe78214e-metrics-client-ca\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.646048 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.646020 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44386569-af8e-4972-bf54-4134fe78214e-serving-certs-ca-bundle\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.646895 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.646871 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44386569-af8e-4972-bf54-4134fe78214e-telemeter-trusted-ca-bundle\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.647792 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.647773 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/44386569-af8e-4972-bf54-4134fe78214e-federate-client-tls\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.648057 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.648034 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/44386569-af8e-4972-bf54-4134fe78214e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.648132 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.648118 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/44386569-af8e-4972-bf54-4134fe78214e-telemeter-client-tls\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.648209 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.648190 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/44386569-af8e-4972-bf54-4134fe78214e-secret-telemeter-client\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.653616 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.653597 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zs5d\" 
(UniqueName: \"kubernetes.io/projected/44386569-af8e-4972-bf54-4134fe78214e-kube-api-access-6zs5d\") pod \"telemeter-client-54dd8cc7bb-rwpx2\" (UID: \"44386569-af8e-4972-bf54-4134fe78214e\") " pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.755433 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.755366 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" Apr 17 16:35:29.875608 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:29.875585 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2"] Apr 17 16:35:29.877594 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:35:29.877557 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44386569_af8e_4972_bf54_4134fe78214e.slice/crio-74d987d18b30a9efcc13241a4e48406a453b60c2176659d238084167079e59b6 WatchSource:0}: Error finding container 74d987d18b30a9efcc13241a4e48406a453b60c2176659d238084167079e59b6: Status 404 returned error can't find the container with id 74d987d18b30a9efcc13241a4e48406a453b60c2176659d238084167079e59b6 Apr 17 16:35:30.100745 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:30.100707 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" event={"ID":"44386569-af8e-4972-bf54-4134fe78214e","Type":"ContainerStarted","Data":"74d987d18b30a9efcc13241a4e48406a453b60c2176659d238084167079e59b6"} Apr 17 16:35:31.157009 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:31.156979 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs\") pod \"network-metrics-daemon-42tgv\" (UID: \"b57612e9-f335-4d71-bdba-f06f0735eee1\") " pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:35:31.159101 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:31.159081 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b57612e9-f335-4d71-bdba-f06f0735eee1-metrics-certs\") pod \"network-metrics-daemon-42tgv\" (UID: \"b57612e9-f335-4d71-bdba-f06f0735eee1\") " pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:35:31.180620 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:31.180597 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xlgkd\"" Apr 17 16:35:31.188758 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:31.188725 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-42tgv" Apr 17 16:35:31.684114 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:31.684089 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-42tgv"] Apr 17 16:35:31.687066 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:35:31.687038 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb57612e9_f335_4d71_bdba_f06f0735eee1.slice/crio-e46b72c53853add517e5e5242ed91e04409f86014c56152a047929048e0f6904 WatchSource:0}: Error finding container e46b72c53853add517e5e5242ed91e04409f86014c56152a047929048e0f6904: Status 404 returned error can't find the container with id e46b72c53853add517e5e5242ed91e04409f86014c56152a047929048e0f6904 Apr 17 16:35:32.107629 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:32.107598 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" event={"ID":"44386569-af8e-4972-bf54-4134fe78214e","Type":"ContainerStarted","Data":"b1f2e5d23c3801d4e24918ed59892859bdd7be19eefd2f1eefe66ec37b5181d0"} Apr 17 16:35:32.108700 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:32.108678 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-42tgv" event={"ID":"b57612e9-f335-4d71-bdba-f06f0735eee1","Type":"ContainerStarted","Data":"e46b72c53853add517e5e5242ed91e04409f86014c56152a047929048e0f6904"} Apr 17 16:35:33.114978 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:33.114942 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" event={"ID":"44386569-af8e-4972-bf54-4134fe78214e","Type":"ContainerStarted","Data":"0655e5529fb263f44bf8e50a18118f3415fb43f15bbaab35ded0f72aaed5a16a"} Apr 17 16:35:33.115380 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:33.114987 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" event={"ID":"44386569-af8e-4972-bf54-4134fe78214e","Type":"ContainerStarted","Data":"1214232c11409b6c249bdf5d922f81f7a65a725f3d75f481ea75a3175a650bb0"} Apr 17 16:35:33.119160 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:33.119133 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-42tgv" event={"ID":"b57612e9-f335-4d71-bdba-f06f0735eee1","Type":"ContainerStarted","Data":"d5bfd5e700547f8730adc1007fdbf8b3c9fb99ae8b18826679040901da3e364c"} Apr 17 16:35:33.140278 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:33.140227 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-54dd8cc7bb-rwpx2" podStartSLOduration=1.061279611 podStartE2EDuration="4.140210213s" podCreationTimestamp="2026-04-17 16:35:29 +0000 UTC" firstStartedPulling="2026-04-17 16:35:29.879481501 +0000 UTC m=+251.102809264" lastFinishedPulling="2026-04-17 16:35:32.958412109 +0000 UTC m=+254.181739866" observedRunningTime="2026-04-17 16:35:33.139252832 +0000 UTC m=+254.362580611" watchObservedRunningTime="2026-04-17 16:35:33.140210213 +0000 UTC m=+254.363537991" Apr 17 16:35:33.831668 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:33.831636 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-64cfbd95d4-7zv9g"] Apr 17 16:35:33.834797 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:33.834782 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:33.838678 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:33.838656 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 16:35:33.838883 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:33.838869 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-fl6zn\"" Apr 17 16:35:33.839200 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:33.839178 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 16:35:33.839706 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:33.839690 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 16:35:33.840254 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:33.840234 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 16:35:33.840544 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:33.840530 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 16:35:33.850872 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:33.850850 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64cfbd95d4-7zv9g"] Apr 17 16:35:33.861841 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:33.861812 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 16:35:33.982574 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:33.982533 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aaecd0d5-9245-449d-b5d0-0b560a7008d3-console-oauth-config\") pod \"console-64cfbd95d4-7zv9g\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:33.982710 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:33.982593 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-console-config\") pod \"console-64cfbd95d4-7zv9g\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:33.982710 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:33.982662 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-service-ca\") pod \"console-64cfbd95d4-7zv9g\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:33.982710 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:33.982690 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-trusted-ca-bundle\") pod \"console-64cfbd95d4-7zv9g\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:33.982864 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:33.982715 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-oauth-serving-cert\") pod \"console-64cfbd95d4-7zv9g\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:33.982864 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:33.982790 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h6z6\" (UniqueName: \"kubernetes.io/projected/aaecd0d5-9245-449d-b5d0-0b560a7008d3-kube-api-access-9h6z6\") pod \"console-64cfbd95d4-7zv9g\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:33.982864 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:33.982815 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aaecd0d5-9245-449d-b5d0-0b560a7008d3-console-serving-cert\") pod \"console-64cfbd95d4-7zv9g\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:34.083398 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:34.083329 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-oauth-serving-cert\") pod \"console-64cfbd95d4-7zv9g\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:34.083398 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:34.083372 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9h6z6\" (UniqueName: \"kubernetes.io/projected/aaecd0d5-9245-449d-b5d0-0b560a7008d3-kube-api-access-9h6z6\") pod \"console-64cfbd95d4-7zv9g\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:34.083398 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:34.083395 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aaecd0d5-9245-449d-b5d0-0b560a7008d3-console-serving-cert\") pod \"console-64cfbd95d4-7zv9g\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:34.083628 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:34.083431 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aaecd0d5-9245-449d-b5d0-0b560a7008d3-console-oauth-config\") pod \"console-64cfbd95d4-7zv9g\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:34.083628 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:34.083452 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-console-config\") pod \"console-64cfbd95d4-7zv9g\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:34.083628 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:34.083469 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-service-ca\") pod 
\"console-64cfbd95d4-7zv9g\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:34.083628 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:34.083488 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-trusted-ca-bundle\") pod \"console-64cfbd95d4-7zv9g\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:34.084186 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:34.084158 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-oauth-serving-cert\") pod \"console-64cfbd95d4-7zv9g\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:34.084284 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:34.084205 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-service-ca\") pod \"console-64cfbd95d4-7zv9g\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:34.084284 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:34.084216 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-console-config\") pod \"console-64cfbd95d4-7zv9g\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:34.084454 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:34.084292 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-trusted-ca-bundle\") pod \"console-64cfbd95d4-7zv9g\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:34.086000 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:34.085971 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aaecd0d5-9245-449d-b5d0-0b560a7008d3-console-serving-cert\") pod \"console-64cfbd95d4-7zv9g\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:34.086097 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:34.085975 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aaecd0d5-9245-449d-b5d0-0b560a7008d3-console-oauth-config\") pod \"console-64cfbd95d4-7zv9g\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:34.091626 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:34.091606 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h6z6\" (UniqueName: \"kubernetes.io/projected/aaecd0d5-9245-449d-b5d0-0b560a7008d3-kube-api-access-9h6z6\") pod \"console-64cfbd95d4-7zv9g\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:34.123763 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:34.123715 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-42tgv" event={"ID":"b57612e9-f335-4d71-bdba-f06f0735eee1","Type":"ContainerStarted","Data":"603a13a9c86e6eded94bdcd21dd48a885f33b0886e754a936598e4b90a1eb924"} Apr 17 16:35:34.142805 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:34.142771 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-42tgv" podStartSLOduration=253.87496371 podStartE2EDuration="4m15.142759096s" podCreationTimestamp="2026-04-17 16:31:19 +0000 UTC" firstStartedPulling="2026-04-17 16:35:31.688773353 +0000 UTC m=+252.912101110" lastFinishedPulling="2026-04-17 16:35:32.956568725 +0000 UTC m=+254.179896496" observedRunningTime="2026-04-17 16:35:34.140859588 +0000 UTC m=+255.364187364" watchObservedRunningTime="2026-04-17 16:35:34.142759096 +0000 UTC m=+255.366086873" Apr 17 16:35:34.144050 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:34.144025 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:34.265233 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:34.265204 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64cfbd95d4-7zv9g"] Apr 17 16:35:34.268413 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:35:34.268381 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaecd0d5_9245_449d_b5d0_0b560a7008d3.slice/crio-1155914b003013aff1edcf9b3e4c5419399ec4dab200bff0abc2e79aab20ca98 WatchSource:0}: Error finding container 1155914b003013aff1edcf9b3e4c5419399ec4dab200bff0abc2e79aab20ca98: Status 404 returned error can't find the container with id 1155914b003013aff1edcf9b3e4c5419399ec4dab200bff0abc2e79aab20ca98 Apr 17 16:35:35.132162 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:35.132125 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64cfbd95d4-7zv9g" event={"ID":"aaecd0d5-9245-449d-b5d0-0b560a7008d3","Type":"ContainerStarted","Data":"59a70ee74f7b5b5dff48b7dc7283655a6c13a9cc2abd1699f7e5ee9d7d387dc8"} Apr 17 16:35:35.132162 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:35.132164 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64cfbd95d4-7zv9g" event={"ID":"aaecd0d5-9245-449d-b5d0-0b560a7008d3","Type":"ContainerStarted","Data":"1155914b003013aff1edcf9b3e4c5419399ec4dab200bff0abc2e79aab20ca98"} Apr 17 16:35:44.144608 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:44.144552 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:44.144608 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:44.144614 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:44.149274 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:44.149251 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:44.159616 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:44.159595 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:35:44.168340 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:35:44.168301 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64cfbd95d4-7zv9g" podStartSLOduration=11.16828982 
podStartE2EDuration="11.16828982s" podCreationTimestamp="2026-04-17 16:35:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:35:35.161050199 +0000 UTC m=+256.384377977" watchObservedRunningTime="2026-04-17 16:35:44.16828982 +0000 UTC m=+265.391617598" Apr 17 16:36:19.256390 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:36:19.256359 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 16:36:19.256390 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:36:19.256375 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 16:36:19.259581 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:36:19.259564 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 16:37:05.208024 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:05.207992 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64cfbd95d4-7zv9g"] Apr 17 16:37:30.230612 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.230528 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-64cfbd95d4-7zv9g" podUID="aaecd0d5-9245-449d-b5d0-0b560a7008d3" containerName="console" containerID="cri-o://59a70ee74f7b5b5dff48b7dc7283655a6c13a9cc2abd1699f7e5ee9d7d387dc8" gracePeriod=15 Apr 17 16:37:30.440813 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.440785 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64cfbd95d4-7zv9g_aaecd0d5-9245-449d-b5d0-0b560a7008d3/console/0.log" Apr 17 16:37:30.440956 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.440823 2574 generic.go:358] "Generic (PLEG): container finished" podID="aaecd0d5-9245-449d-b5d0-0b560a7008d3" containerID="59a70ee74f7b5b5dff48b7dc7283655a6c13a9cc2abd1699f7e5ee9d7d387dc8" exitCode=2 Apr 17 16:37:30.440956 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.440857 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64cfbd95d4-7zv9g" event={"ID":"aaecd0d5-9245-449d-b5d0-0b560a7008d3","Type":"ContainerDied","Data":"59a70ee74f7b5b5dff48b7dc7283655a6c13a9cc2abd1699f7e5ee9d7d387dc8"} Apr 17 16:37:30.459625 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.459605 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64cfbd95d4-7zv9g_aaecd0d5-9245-449d-b5d0-0b560a7008d3/console/0.log" Apr 17 16:37:30.459724 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.459662 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:37:30.512699 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.512638 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aaecd0d5-9245-449d-b5d0-0b560a7008d3-console-oauth-config\") pod \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " Apr 17 16:37:30.512699 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.512665 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aaecd0d5-9245-449d-b5d0-0b560a7008d3-console-serving-cert\") pod \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " Apr 17 16:37:30.514630 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.514598 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaecd0d5-9245-449d-b5d0-0b560a7008d3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "aaecd0d5-9245-449d-b5d0-0b560a7008d3" (UID: "aaecd0d5-9245-449d-b5d0-0b560a7008d3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:37:30.514754 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.514645 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaecd0d5-9245-449d-b5d0-0b560a7008d3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "aaecd0d5-9245-449d-b5d0-0b560a7008d3" (UID: "aaecd0d5-9245-449d-b5d0-0b560a7008d3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:37:30.613075 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.613051 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-service-ca\") pod \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " Apr 17 16:37:30.613180 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.613090 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-console-config\") pod \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " Apr 17 16:37:30.613180 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.613111 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h6z6\" (UniqueName: \"kubernetes.io/projected/aaecd0d5-9245-449d-b5d0-0b560a7008d3-kube-api-access-9h6z6\") pod \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " Apr 17 16:37:30.613284 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.613247 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-trusted-ca-bundle\") pod \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " Apr 17 16:37:30.613335 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.613311 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-oauth-serving-cert\") pod \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\" (UID: \"aaecd0d5-9245-449d-b5d0-0b560a7008d3\") " Apr 17 16:37:30.613404 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.613380 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-service-ca" (OuterVolumeSpecName: "service-ca") pod "aaecd0d5-9245-449d-b5d0-0b560a7008d3" (UID: "aaecd0d5-9245-449d-b5d0-0b560a7008d3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:37:30.613523 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.613401 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-console-config" (OuterVolumeSpecName: "console-config") pod "aaecd0d5-9245-449d-b5d0-0b560a7008d3" (UID: "aaecd0d5-9245-449d-b5d0-0b560a7008d3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:37:30.613594 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.613576 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "aaecd0d5-9245-449d-b5d0-0b560a7008d3" (UID: "aaecd0d5-9245-449d-b5d0-0b560a7008d3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:37:30.613637 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.613620 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "aaecd0d5-9245-449d-b5d0-0b560a7008d3" (UID: "aaecd0d5-9245-449d-b5d0-0b560a7008d3"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:37:30.613672 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.613638 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-trusted-ca-bundle\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:37:30.613672 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.613649 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aaecd0d5-9245-449d-b5d0-0b560a7008d3-console-oauth-config\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:37:30.613672 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.613661 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aaecd0d5-9245-449d-b5d0-0b560a7008d3-console-serving-cert\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:37:30.613672 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.613671 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-service-ca\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:37:30.613808 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.613680 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-console-config\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:37:30.614977 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.614962 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaecd0d5-9245-449d-b5d0-0b560a7008d3-kube-api-access-9h6z6" (OuterVolumeSpecName: "kube-api-access-9h6z6") pod "aaecd0d5-9245-449d-b5d0-0b560a7008d3" (UID: "aaecd0d5-9245-449d-b5d0-0b560a7008d3"). InnerVolumeSpecName "kube-api-access-9h6z6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:37:30.714554 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.714514 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9h6z6\" (UniqueName: \"kubernetes.io/projected/aaecd0d5-9245-449d-b5d0-0b560a7008d3-kube-api-access-9h6z6\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:37:30.714554 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:30.714550 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aaecd0d5-9245-449d-b5d0-0b560a7008d3-oauth-serving-cert\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:37:31.444655 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:31.444628 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64cfbd95d4-7zv9g_aaecd0d5-9245-449d-b5d0-0b560a7008d3/console/0.log" Apr 17 16:37:31.445107 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:31.444708 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64cfbd95d4-7zv9g" event={"ID":"aaecd0d5-9245-449d-b5d0-0b560a7008d3","Type":"ContainerDied","Data":"1155914b003013aff1edcf9b3e4c5419399ec4dab200bff0abc2e79aab20ca98"} Apr 17 16:37:31.445107 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:31.444755 2574 scope.go:117] "RemoveContainer" containerID="59a70ee74f7b5b5dff48b7dc7283655a6c13a9cc2abd1699f7e5ee9d7d387dc8" Apr 17 16:37:31.445107 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:31.444762 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64cfbd95d4-7zv9g" Apr 17 16:37:31.470002 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:31.469941 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64cfbd95d4-7zv9g"] Apr 17 16:37:31.473774 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:31.473754 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-64cfbd95d4-7zv9g"] Apr 17 16:37:33.381581 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:33.381546 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaecd0d5-9245-449d-b5d0-0b560a7008d3" path="/var/lib/kubelet/pods/aaecd0d5-9245-449d-b5d0-0b560a7008d3/volumes" Apr 17 16:37:52.679543 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:52.679508 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59f98cb8f9-5dvfx"] Apr 17 16:37:52.679935 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:52.679794 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aaecd0d5-9245-449d-b5d0-0b560a7008d3" containerName="console" Apr 17 16:37:52.679935 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:52.679806 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaecd0d5-9245-449d-b5d0-0b560a7008d3" containerName="console" Apr 17 16:37:52.679935 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:52.679858 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="aaecd0d5-9245-449d-b5d0-0b560a7008d3" containerName="console" Apr 17 16:37:52.681652 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:52.681637 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59f98cb8f9-5dvfx" Apr 17 16:37:52.685074 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:52.685049 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 16:37:52.685074 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:52.685070 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 16:37:52.686099 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:52.686081 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 16:37:52.686099 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:52.686090 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 16:37:52.693522 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:52.693503 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59f98cb8f9-5dvfx"] Apr 17 16:37:52.775449 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:52.775421 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rd46\" (UniqueName: \"kubernetes.io/projected/2f066091-ec6a-49df-b7a0-64a92f5db1c3-kube-api-access-6rd46\") pod \"managed-serviceaccount-addon-agent-59f98cb8f9-5dvfx\" (UID: \"2f066091-ec6a-49df-b7a0-64a92f5db1c3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59f98cb8f9-5dvfx" Apr 17 16:37:52.775595 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:52.775467 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2f066091-ec6a-49df-b7a0-64a92f5db1c3-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-59f98cb8f9-5dvfx\" (UID: \"2f066091-ec6a-49df-b7a0-64a92f5db1c3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59f98cb8f9-5dvfx" Apr 17 16:37:52.876361 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:52.876328 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rd46\" (UniqueName: \"kubernetes.io/projected/2f066091-ec6a-49df-b7a0-64a92f5db1c3-kube-api-access-6rd46\") pod \"managed-serviceaccount-addon-agent-59f98cb8f9-5dvfx\" (UID: \"2f066091-ec6a-49df-b7a0-64a92f5db1c3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59f98cb8f9-5dvfx" Apr 17 16:37:52.876527 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:52.876382 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2f066091-ec6a-49df-b7a0-64a92f5db1c3-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-59f98cb8f9-5dvfx\" (UID: \"2f066091-ec6a-49df-b7a0-64a92f5db1c3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59f98cb8f9-5dvfx" Apr 17 16:37:52.878790 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:52.878762 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/2f066091-ec6a-49df-b7a0-64a92f5db1c3-hub-kubeconfig\") pod 
\"managed-serviceaccount-addon-agent-59f98cb8f9-5dvfx\" (UID: \"2f066091-ec6a-49df-b7a0-64a92f5db1c3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59f98cb8f9-5dvfx" Apr 17 16:37:52.896498 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:52.896470 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rd46\" (UniqueName: \"kubernetes.io/projected/2f066091-ec6a-49df-b7a0-64a92f5db1c3-kube-api-access-6rd46\") pod \"managed-serviceaccount-addon-agent-59f98cb8f9-5dvfx\" (UID: \"2f066091-ec6a-49df-b7a0-64a92f5db1c3\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59f98cb8f9-5dvfx" Apr 17 16:37:53.001789 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:53.001695 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59f98cb8f9-5dvfx" Apr 17 16:37:53.123000 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:53.122967 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59f98cb8f9-5dvfx"] Apr 17 16:37:53.127226 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:37:53.127191 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f066091_ec6a_49df_b7a0_64a92f5db1c3.slice/crio-3021bf89de7a8d16763d1bbf2eeb1288c8e270677d3047d1150ef68637227802 WatchSource:0}: Error finding container 3021bf89de7a8d16763d1bbf2eeb1288c8e270677d3047d1150ef68637227802: Status 404 returned error can't find the container with id 3021bf89de7a8d16763d1bbf2eeb1288c8e270677d3047d1150ef68637227802 Apr 17 16:37:53.128787 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:53.128770 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:37:53.503534 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:53.503502 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59f98cb8f9-5dvfx" event={"ID":"2f066091-ec6a-49df-b7a0-64a92f5db1c3","Type":"ContainerStarted","Data":"3021bf89de7a8d16763d1bbf2eeb1288c8e270677d3047d1150ef68637227802"} Apr 17 16:37:56.517614 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:56.517585 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59f98cb8f9-5dvfx" event={"ID":"2f066091-ec6a-49df-b7a0-64a92f5db1c3","Type":"ContainerStarted","Data":"18ebca39e62df23dda277e2588c0c06b8cb33d2b6717066d8b2699362b53aa39"} Apr 17 16:37:56.537161 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:37:56.537116 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-59f98cb8f9-5dvfx" podStartSLOduration=2.193593281 podStartE2EDuration="4.537102436s" podCreationTimestamp="2026-04-17 16:37:52 +0000 UTC" firstStartedPulling="2026-04-17 16:37:53.128918167 +0000 UTC m=+394.352245923" lastFinishedPulling="2026-04-17 16:37:55.472427318 +0000 UTC m=+396.695755078" observedRunningTime="2026-04-17 16:37:56.536366637 +0000 UTC m=+397.759694417" watchObservedRunningTime="2026-04-17 16:37:56.537102436 +0000 UTC m=+397.760430219" Apr 17 16:38:20.095364 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:20.095326 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw"] Apr 17 16:38:20.105024 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:20.104989 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw" Apr 17 16:38:20.106545 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:20.106517 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw"] Apr 17 16:38:20.107657 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:20.107639 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 16:38:20.108678 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:20.108631 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-84bv6\"" Apr 17 16:38:20.108678 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:20.108647 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 16:38:20.164629 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:20.164595 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54579051-fea0-409c-a527-34635a1a8529-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw\" (UID: \"54579051-fea0-409c-a527-34635a1a8529\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw" Apr 17 16:38:20.164829 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:20.164643 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54579051-fea0-409c-a527-34635a1a8529-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw\" (UID: \"54579051-fea0-409c-a527-34635a1a8529\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw" Apr 17 16:38:20.164829 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:20.164691 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcrr6\" (UniqueName: \"kubernetes.io/projected/54579051-fea0-409c-a527-34635a1a8529-kube-api-access-dcrr6\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw\" (UID: \"54579051-fea0-409c-a527-34635a1a8529\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw" Apr 17 16:38:20.265749 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:20.265710 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54579051-fea0-409c-a527-34635a1a8529-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw\" (UID: \"54579051-fea0-409c-a527-34635a1a8529\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw" Apr 17 16:38:20.265898 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:20.265780 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54579051-fea0-409c-a527-34635a1a8529-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw\" (UID: 
\"54579051-fea0-409c-a527-34635a1a8529\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw" Apr 17 16:38:20.265898 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:20.265805 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcrr6\" (UniqueName: \"kubernetes.io/projected/54579051-fea0-409c-a527-34635a1a8529-kube-api-access-dcrr6\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw\" (UID: \"54579051-fea0-409c-a527-34635a1a8529\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw" Apr 17 16:38:20.266091 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:20.266074 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54579051-fea0-409c-a527-34635a1a8529-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw\" (UID: \"54579051-fea0-409c-a527-34635a1a8529\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw" Apr 17 16:38:20.266192 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:20.266158 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54579051-fea0-409c-a527-34635a1a8529-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw\" (UID: \"54579051-fea0-409c-a527-34635a1a8529\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw" Apr 17 16:38:20.276294 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:20.276260 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcrr6\" (UniqueName: \"kubernetes.io/projected/54579051-fea0-409c-a527-34635a1a8529-kube-api-access-dcrr6\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw\" (UID: \"54579051-fea0-409c-a527-34635a1a8529\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw" Apr 17 16:38:20.415177 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:20.415088 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw" Apr 17 16:38:20.546364 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:20.546328 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw"] Apr 17 16:38:20.549286 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:38:20.549257 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54579051_fea0_409c_a527_34635a1a8529.slice/crio-a443dae243f2a3ed873bb37a14bc7329b4f62e5370e958019102d1aa7679076f WatchSource:0}: Error finding container a443dae243f2a3ed873bb37a14bc7329b4f62e5370e958019102d1aa7679076f: Status 404 returned error can't find the container with id a443dae243f2a3ed873bb37a14bc7329b4f62e5370e958019102d1aa7679076f Apr 17 16:38:20.582674 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:20.582640 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw" event={"ID":"54579051-fea0-409c-a527-34635a1a8529","Type":"ContainerStarted","Data":"a443dae243f2a3ed873bb37a14bc7329b4f62e5370e958019102d1aa7679076f"} Apr 17 16:38:26.601877 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:26.601838 2574 generic.go:358] "Generic (PLEG): container finished" podID="54579051-fea0-409c-a527-34635a1a8529" containerID="bc9976201325dbc1992a19c9c6788a36c3451a27983f2974e755edccb5e344db" exitCode=0 Apr 17 16:38:26.602239 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:26.601884 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw" event={"ID":"54579051-fea0-409c-a527-34635a1a8529","Type":"ContainerDied","Data":"bc9976201325dbc1992a19c9c6788a36c3451a27983f2974e755edccb5e344db"} Apr 17 16:38:28.609434 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:28.609398 2574 generic.go:358] "Generic (PLEG): container finished" podID="54579051-fea0-409c-a527-34635a1a8529" containerID="8857781cbbf8a6bae52e8db35274b221180dd67929a97ead4cefbbc64ab2a1fa" exitCode=0 Apr 17 16:38:28.609782 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:28.609473 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw" event={"ID":"54579051-fea0-409c-a527-34635a1a8529","Type":"ContainerDied","Data":"8857781cbbf8a6bae52e8db35274b221180dd67929a97ead4cefbbc64ab2a1fa"} Apr 17 16:38:35.632117 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:35.632076 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw" event={"ID":"54579051-fea0-409c-a527-34635a1a8529","Type":"ContainerStarted","Data":"a6d83c79ad1203d73ce6a1967901920fabc0519dd20c73ccbfc0367d624f7ce9"} Apr 17 16:38:35.653158 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:35.653076 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw" podStartSLOduration=0.676380062 podStartE2EDuration="15.653061642s" podCreationTimestamp="2026-04-17 16:38:20 +0000 UTC" firstStartedPulling="2026-04-17 16:38:20.551358574 +0000 UTC m=+421.774686333" lastFinishedPulling="2026-04-17 16:38:35.528040148 +0000 UTC m=+436.751367913" observedRunningTime="2026-04-17 16:38:35.651832076 
+0000 UTC m=+436.875159855" watchObservedRunningTime="2026-04-17 16:38:35.653061642 +0000 UTC m=+436.876389420" Apr 17 16:38:36.637043 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:36.637009 2574 generic.go:358] "Generic (PLEG): container finished" podID="54579051-fea0-409c-a527-34635a1a8529" containerID="a6d83c79ad1203d73ce6a1967901920fabc0519dd20c73ccbfc0367d624f7ce9" exitCode=0 Apr 17 16:38:36.637404 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:36.637092 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw" event={"ID":"54579051-fea0-409c-a527-34635a1a8529","Type":"ContainerDied","Data":"a6d83c79ad1203d73ce6a1967901920fabc0519dd20c73ccbfc0367d624f7ce9"} Apr 17 16:38:37.752475 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:37.752450 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw" Apr 17 16:38:37.907565 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:37.907465 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54579051-fea0-409c-a527-34635a1a8529-util\") pod \"54579051-fea0-409c-a527-34635a1a8529\" (UID: \"54579051-fea0-409c-a527-34635a1a8529\") " Apr 17 16:38:37.907565 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:37.907552 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54579051-fea0-409c-a527-34635a1a8529-bundle\") pod \"54579051-fea0-409c-a527-34635a1a8529\" (UID: \"54579051-fea0-409c-a527-34635a1a8529\") " Apr 17 16:38:37.907840 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:37.907602 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcrr6\" (UniqueName: \"kubernetes.io/projected/54579051-fea0-409c-a527-34635a1a8529-kube-api-access-dcrr6\") pod \"54579051-fea0-409c-a527-34635a1a8529\" (UID: \"54579051-fea0-409c-a527-34635a1a8529\") " Apr 17 16:38:37.908117 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:37.908092 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54579051-fea0-409c-a527-34635a1a8529-bundle" (OuterVolumeSpecName: "bundle") pod "54579051-fea0-409c-a527-34635a1a8529" (UID: "54579051-fea0-409c-a527-34635a1a8529"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:38:37.909878 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:37.909855 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54579051-fea0-409c-a527-34635a1a8529-kube-api-access-dcrr6" (OuterVolumeSpecName: "kube-api-access-dcrr6") pod "54579051-fea0-409c-a527-34635a1a8529" (UID: "54579051-fea0-409c-a527-34635a1a8529"). InnerVolumeSpecName "kube-api-access-dcrr6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:38:37.911600 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:37.911577 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54579051-fea0-409c-a527-34635a1a8529-util" (OuterVolumeSpecName: "util") pod "54579051-fea0-409c-a527-34635a1a8529" (UID: "54579051-fea0-409c-a527-34635a1a8529"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:38:38.008906 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:38.008874 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/54579051-fea0-409c-a527-34635a1a8529-bundle\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:38:38.008906 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:38.008901 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dcrr6\" (UniqueName: \"kubernetes.io/projected/54579051-fea0-409c-a527-34635a1a8529-kube-api-access-dcrr6\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:38:38.008906 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:38.008914 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/54579051-fea0-409c-a527-34635a1a8529-util\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:38:38.644002 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:38.643967 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw" event={"ID":"54579051-fea0-409c-a527-34635a1a8529","Type":"ContainerDied","Data":"a443dae243f2a3ed873bb37a14bc7329b4f62e5370e958019102d1aa7679076f"} Apr 17 16:38:38.644002 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:38.643998 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a443dae243f2a3ed873bb37a14bc7329b4f62e5370e958019102d1aa7679076f" Apr 17 16:38:38.644210 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:38.644021 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ckdhmw" Apr 17 16:38:42.449925 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.449886 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7ttls"] Apr 17 16:38:42.450381 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.450263 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54579051-fea0-409c-a527-34635a1a8529" containerName="extract" Apr 17 16:38:42.450381 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.450280 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="54579051-fea0-409c-a527-34635a1a8529" containerName="extract" Apr 17 16:38:42.450381 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.450301 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54579051-fea0-409c-a527-34635a1a8529" containerName="pull" Apr 17 16:38:42.450381 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.450311 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="54579051-fea0-409c-a527-34635a1a8529" containerName="pull" Apr 17 16:38:42.450381 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.450329 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54579051-fea0-409c-a527-34635a1a8529" containerName="util" Apr 17 16:38:42.450381 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.450337 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="54579051-fea0-409c-a527-34635a1a8529" containerName="util" Apr 17 16:38:42.450662 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.450410 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="54579051-fea0-409c-a527-34635a1a8529" containerName="extract" Apr 
17 16:38:42.502931 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.502899 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7ttls"] Apr 17 16:38:42.503103 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.503015 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7ttls" Apr 17 16:38:42.506938 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.506911 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 17 16:38:42.507098 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.506954 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 17 16:38:42.507098 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.506980 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-89gcf\"" Apr 17 16:38:42.507098 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.507046 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 17 16:38:42.642347 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.642315 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/db361a7c-3af5-4887-857f-0797bf4729cc-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7ttls\" (UID: \"db361a7c-3af5-4887-857f-0797bf4729cc\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7ttls" Apr 17 16:38:42.642510 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.642368 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsc9k\" (UniqueName: \"kubernetes.io/projected/db361a7c-3af5-4887-857f-0797bf4729cc-kube-api-access-dsc9k\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7ttls\" (UID: \"db361a7c-3af5-4887-857f-0797bf4729cc\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7ttls" Apr 17 16:38:42.743387 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.743310 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/db361a7c-3af5-4887-857f-0797bf4729cc-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7ttls\" (UID: \"db361a7c-3af5-4887-857f-0797bf4729cc\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7ttls" Apr 17 16:38:42.743387 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.743364 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsc9k\" (UniqueName: \"kubernetes.io/projected/db361a7c-3af5-4887-857f-0797bf4729cc-kube-api-access-dsc9k\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7ttls\" (UID: \"db361a7c-3af5-4887-857f-0797bf4729cc\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7ttls" Apr 17 16:38:42.745661 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.745639 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/db361a7c-3af5-4887-857f-0797bf4729cc-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7ttls\" (UID: \"db361a7c-3af5-4887-857f-0797bf4729cc\") " 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7ttls" Apr 17 16:38:42.764957 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.764924 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsc9k\" (UniqueName: \"kubernetes.io/projected/db361a7c-3af5-4887-857f-0797bf4729cc-kube-api-access-dsc9k\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-7ttls\" (UID: \"db361a7c-3af5-4887-857f-0797bf4729cc\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7ttls" Apr 17 16:38:42.812759 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.812719 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7ttls" Apr 17 16:38:42.945304 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:42.945275 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7ttls"] Apr 17 16:38:42.948443 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:38:42.948412 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb361a7c_3af5_4887_857f_0797bf4729cc.slice/crio-1ae5938f091c2c3b4135df8b353e5f0b6b2aa8d3da16148819b5934c18514cf7 WatchSource:0}: Error finding container 1ae5938f091c2c3b4135df8b353e5f0b6b2aa8d3da16148819b5934c18514cf7: Status 404 returned error can't find the container with id 1ae5938f091c2c3b4135df8b353e5f0b6b2aa8d3da16148819b5934c18514cf7 Apr 17 16:38:43.662828 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:43.662777 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7ttls" event={"ID":"db361a7c-3af5-4887-857f-0797bf4729cc","Type":"ContainerStarted","Data":"1ae5938f091c2c3b4135df8b353e5f0b6b2aa8d3da16148819b5934c18514cf7"} Apr 17 16:38:46.673746 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:46.673629 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7ttls" event={"ID":"db361a7c-3af5-4887-857f-0797bf4729cc","Type":"ContainerStarted","Data":"f1193573718430422a8a04cda30ba90e3dab6b08a9a27d46e48518210a1caedf"} Apr 17 16:38:46.674204 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:46.673777 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7ttls" Apr 17 16:38:46.696342 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:46.696297 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7ttls" podStartSLOduration=1.239970854 podStartE2EDuration="4.696284473s" podCreationTimestamp="2026-04-17 16:38:42 +0000 UTC" firstStartedPulling="2026-04-17 16:38:42.950132033 +0000 UTC m=+444.173459792" lastFinishedPulling="2026-04-17 16:38:46.40644565 +0000 UTC m=+447.629773411" observedRunningTime="2026-04-17 16:38:46.694179776 +0000 UTC m=+447.917507568" watchObservedRunningTime="2026-04-17 16:38:46.696284473 +0000 UTC m=+447.919612248" Apr 17 16:38:46.934475 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:46.934387 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-b4lpf"] Apr 17 16:38:46.937612 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:46.937587 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-b4lpf" Apr 17 16:38:46.941826 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:46.941807 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 17 16:38:46.941826 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:46.941818 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 17 16:38:46.941976 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:46.941903 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-frp84\"" Apr 17 16:38:46.952595 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:46.952572 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-b4lpf"] Apr 17 16:38:47.082887 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.082853 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/151d5585-c63e-46bd-840a-222a14ad1b36-cabundle0\") pod \"keda-operator-ffbb595cb-b4lpf\" (UID: \"151d5585-c63e-46bd-840a-222a14ad1b36\") " pod="openshift-keda/keda-operator-ffbb595cb-b4lpf" Apr 17 16:38:47.082887 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.082890 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkjtd\" (UniqueName: \"kubernetes.io/projected/151d5585-c63e-46bd-840a-222a14ad1b36-kube-api-access-jkjtd\") pod \"keda-operator-ffbb595cb-b4lpf\" (UID: \"151d5585-c63e-46bd-840a-222a14ad1b36\") " pod="openshift-keda/keda-operator-ffbb595cb-b4lpf" Apr 17 16:38:47.083089 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.082975 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/151d5585-c63e-46bd-840a-222a14ad1b36-certificates\") pod \"keda-operator-ffbb595cb-b4lpf\" (UID: \"151d5585-c63e-46bd-840a-222a14ad1b36\") " pod="openshift-keda/keda-operator-ffbb595cb-b4lpf" Apr 17 16:38:47.183301 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.183271 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/151d5585-c63e-46bd-840a-222a14ad1b36-certificates\") pod \"keda-operator-ffbb595cb-b4lpf\" (UID: \"151d5585-c63e-46bd-840a-222a14ad1b36\") " pod="openshift-keda/keda-operator-ffbb595cb-b4lpf" Apr 17 16:38:47.183491 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.183318 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/151d5585-c63e-46bd-840a-222a14ad1b36-cabundle0\") pod \"keda-operator-ffbb595cb-b4lpf\" (UID: \"151d5585-c63e-46bd-840a-222a14ad1b36\") " pod="openshift-keda/keda-operator-ffbb595cb-b4lpf" Apr 17 16:38:47.183491 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.183337 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkjtd\" (UniqueName: \"kubernetes.io/projected/151d5585-c63e-46bd-840a-222a14ad1b36-kube-api-access-jkjtd\") pod \"keda-operator-ffbb595cb-b4lpf\" (UID: \"151d5585-c63e-46bd-840a-222a14ad1b36\") " pod="openshift-keda/keda-operator-ffbb595cb-b4lpf" Apr 17 16:38:47.183491 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:47.183434 2574 projected.go:264] Couldn't get 
secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 17 16:38:47.183491 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:47.183456 2574 secret.go:281] references non-existent secret key: ca.crt Apr 17 16:38:47.183491 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:47.183463 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 16:38:47.183491 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:47.183475 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-b4lpf: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 17 16:38:47.183830 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:47.183532 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/151d5585-c63e-46bd-840a-222a14ad1b36-certificates podName:151d5585-c63e-46bd-840a-222a14ad1b36 nodeName:}" failed. No retries permitted until 2026-04-17 16:38:47.683515138 +0000 UTC m=+448.906842896 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/151d5585-c63e-46bd-840a-222a14ad1b36-certificates") pod "keda-operator-ffbb595cb-b4lpf" (UID: "151d5585-c63e-46bd-840a-222a14ad1b36") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 17 16:38:47.184082 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.184061 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/151d5585-c63e-46bd-840a-222a14ad1b36-cabundle0\") pod \"keda-operator-ffbb595cb-b4lpf\" (UID: \"151d5585-c63e-46bd-840a-222a14ad1b36\") " pod="openshift-keda/keda-operator-ffbb595cb-b4lpf" Apr 17 16:38:47.200004 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.199947 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkjtd\" (UniqueName: \"kubernetes.io/projected/151d5585-c63e-46bd-840a-222a14ad1b36-kube-api-access-jkjtd\") pod \"keda-operator-ffbb595cb-b4lpf\" (UID: \"151d5585-c63e-46bd-840a-222a14ad1b36\") " pod="openshift-keda/keda-operator-ffbb595cb-b4lpf" Apr 17 16:38:47.410360 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.410329 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p"] Apr 17 16:38:47.413522 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.413505 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p" Apr 17 16:38:47.429912 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.429892 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 17 16:38:47.460804 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.460719 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p"] Apr 17 16:38:47.586328 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.586293 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-x9f2p\" (UID: \"0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p" Apr 17 16:38:47.586328 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.586333 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-x9f2p\" (UID: \"0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p" Apr 17 16:38:47.586556 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.586371 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krcr8\" (UniqueName: \"kubernetes.io/projected/0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3-kube-api-access-krcr8\") pod \"keda-metrics-apiserver-7c9f485588-x9f2p\" (UID: \"0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p" Apr 17 16:38:47.622452 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.622417 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-f4gvn"] Apr 17 16:38:47.625630 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.625613 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-f4gvn" Apr 17 16:38:47.629373 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.629350 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 17 16:38:47.650788 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.650759 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-f4gvn"] Apr 17 16:38:47.687431 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.687397 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/151d5585-c63e-46bd-840a-222a14ad1b36-certificates\") pod \"keda-operator-ffbb595cb-b4lpf\" (UID: \"151d5585-c63e-46bd-840a-222a14ad1b36\") " pod="openshift-keda/keda-operator-ffbb595cb-b4lpf" Apr 17 16:38:47.687431 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.687432 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-x9f2p\" (UID: \"0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p" Apr 17 16:38:47.687845 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.687453 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-x9f2p\" (UID: \"0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p" Apr 17 16:38:47.687845 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.687489 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krcr8\" (UniqueName: \"kubernetes.io/projected/0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3-kube-api-access-krcr8\") pod \"keda-metrics-apiserver-7c9f485588-x9f2p\" (UID: \"0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p" Apr 17 16:38:47.687845 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:47.687557 2574 secret.go:281] references non-existent secret key: ca.crt Apr 17 16:38:47.687845 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:47.687579 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 16:38:47.687845 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:47.687590 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-b4lpf: references non-existent secret key: ca.crt Apr 17 16:38:47.687845 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:47.687599 2574 secret.go:281] references non-existent secret key: tls.crt Apr 17 16:38:47.687845 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:47.687616 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 16:38:47.687845 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:47.687634 2574 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 17 16:38:47.687845 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:47.687654 2574 projected.go:194] Error preparing data for projected volume certificates for pod 
openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 17 16:38:47.687845 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:47.687654 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/151d5585-c63e-46bd-840a-222a14ad1b36-certificates podName:151d5585-c63e-46bd-840a-222a14ad1b36 nodeName:}" failed. No retries permitted until 2026-04-17 16:38:48.687633562 +0000 UTC m=+449.910961322 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/151d5585-c63e-46bd-840a-222a14ad1b36-certificates") pod "keda-operator-ffbb595cb-b4lpf" (UID: "151d5585-c63e-46bd-840a-222a14ad1b36") : references non-existent secret key: ca.crt Apr 17 16:38:47.687845 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:47.687722 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3-certificates podName:0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3 nodeName:}" failed. No retries permitted until 2026-04-17 16:38:48.187702284 +0000 UTC m=+449.411030041 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3-certificates") pod "keda-metrics-apiserver-7c9f485588-x9f2p" (UID: "0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 17 16:38:47.687845 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.687844 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-x9f2p\" (UID: \"0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p" Apr 17 16:38:47.711220 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.711150 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krcr8\" (UniqueName: \"kubernetes.io/projected/0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3-kube-api-access-krcr8\") pod \"keda-metrics-apiserver-7c9f485588-x9f2p\" (UID: \"0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p" Apr 17 16:38:47.788399 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.788359 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr7hx\" (UniqueName: \"kubernetes.io/projected/ca75d58c-0feb-4520-ae8a-65cc6c3f6d67-kube-api-access-wr7hx\") pod \"keda-admission-cf49989db-f4gvn\" (UID: \"ca75d58c-0feb-4520-ae8a-65cc6c3f6d67\") " pod="openshift-keda/keda-admission-cf49989db-f4gvn" Apr 17 16:38:47.788399 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.788405 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ca75d58c-0feb-4520-ae8a-65cc6c3f6d67-certificates\") pod \"keda-admission-cf49989db-f4gvn\" (UID: \"ca75d58c-0feb-4520-ae8a-65cc6c3f6d67\") " pod="openshift-keda/keda-admission-cf49989db-f4gvn" Apr 17 16:38:47.889059 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.889021 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wr7hx\" (UniqueName: 
\"kubernetes.io/projected/ca75d58c-0feb-4520-ae8a-65cc6c3f6d67-kube-api-access-wr7hx\") pod \"keda-admission-cf49989db-f4gvn\" (UID: \"ca75d58c-0feb-4520-ae8a-65cc6c3f6d67\") " pod="openshift-keda/keda-admission-cf49989db-f4gvn" Apr 17 16:38:47.889059 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.889064 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ca75d58c-0feb-4520-ae8a-65cc6c3f6d67-certificates\") pod \"keda-admission-cf49989db-f4gvn\" (UID: \"ca75d58c-0feb-4520-ae8a-65cc6c3f6d67\") " pod="openshift-keda/keda-admission-cf49989db-f4gvn" Apr 17 16:38:47.891460 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.891435 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/ca75d58c-0feb-4520-ae8a-65cc6c3f6d67-certificates\") pod \"keda-admission-cf49989db-f4gvn\" (UID: \"ca75d58c-0feb-4520-ae8a-65cc6c3f6d67\") " pod="openshift-keda/keda-admission-cf49989db-f4gvn" Apr 17 16:38:47.904297 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.904264 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr7hx\" (UniqueName: \"kubernetes.io/projected/ca75d58c-0feb-4520-ae8a-65cc6c3f6d67-kube-api-access-wr7hx\") pod \"keda-admission-cf49989db-f4gvn\" (UID: \"ca75d58c-0feb-4520-ae8a-65cc6c3f6d67\") " pod="openshift-keda/keda-admission-cf49989db-f4gvn" Apr 17 16:38:47.935322 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:47.935291 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-f4gvn" Apr 17 16:38:48.073309 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:48.073275 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-f4gvn"] Apr 17 16:38:48.077477 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:38:48.077448 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca75d58c_0feb_4520_ae8a_65cc6c3f6d67.slice/crio-0b4829ca78be5c74e5142115736c0271cb3259d55a9c25fe09a76f17b685caa8 WatchSource:0}: Error finding container 0b4829ca78be5c74e5142115736c0271cb3259d55a9c25fe09a76f17b685caa8: Status 404 returned error can't find the container with id 0b4829ca78be5c74e5142115736c0271cb3259d55a9c25fe09a76f17b685caa8 Apr 17 16:38:48.191492 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:48.191460 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-x9f2p\" (UID: \"0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p" Apr 17 16:38:48.191695 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:48.191620 2574 secret.go:281] references non-existent secret key: tls.crt Apr 17 16:38:48.191695 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:48.191641 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 16:38:48.191695 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:48.191666 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p: references non-existent secret key: tls.crt Apr 17 16:38:48.191886 ip-10-0-129-144 kubenswrapper[2574]: E0417 
16:38:48.191753 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3-certificates podName:0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3 nodeName:}" failed. No retries permitted until 2026-04-17 16:38:49.191709083 +0000 UTC m=+450.415036843 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3-certificates") pod "keda-metrics-apiserver-7c9f485588-x9f2p" (UID: "0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3") : references non-existent secret key: tls.crt Apr 17 16:38:48.681397 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:48.681366 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-f4gvn" event={"ID":"ca75d58c-0feb-4520-ae8a-65cc6c3f6d67","Type":"ContainerStarted","Data":"0b4829ca78be5c74e5142115736c0271cb3259d55a9c25fe09a76f17b685caa8"} Apr 17 16:38:48.695901 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:48.695871 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/151d5585-c63e-46bd-840a-222a14ad1b36-certificates\") pod \"keda-operator-ffbb595cb-b4lpf\" (UID: \"151d5585-c63e-46bd-840a-222a14ad1b36\") " pod="openshift-keda/keda-operator-ffbb595cb-b4lpf" Apr 17 16:38:48.696260 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:48.695982 2574 secret.go:281] references non-existent secret key: ca.crt Apr 17 16:38:48.696260 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:48.695994 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 16:38:48.696260 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:48.696003 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-b4lpf: references non-existent secret key: ca.crt Apr 17 16:38:48.696260 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:48.696048 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/151d5585-c63e-46bd-840a-222a14ad1b36-certificates podName:151d5585-c63e-46bd-840a-222a14ad1b36 nodeName:}" failed. No retries permitted until 2026-04-17 16:38:50.696035793 +0000 UTC m=+451.919363548 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/151d5585-c63e-46bd-840a-222a14ad1b36-certificates") pod "keda-operator-ffbb595cb-b4lpf" (UID: "151d5585-c63e-46bd-840a-222a14ad1b36") : references non-existent secret key: ca.crt Apr 17 16:38:49.201056 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:49.201019 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-x9f2p\" (UID: \"0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p" Apr 17 16:38:49.201234 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:49.201158 2574 secret.go:281] references non-existent secret key: tls.crt Apr 17 16:38:49.201234 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:49.201179 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 16:38:49.201234 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:49.201197 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p: references non-existent secret key: tls.crt Apr 17 16:38:49.201333 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:49.201250 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3-certificates podName:0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3 nodeName:}" failed. No retries permitted until 2026-04-17 16:38:51.201235554 +0000 UTC m=+452.424563309 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3-certificates") pod "keda-metrics-apiserver-7c9f485588-x9f2p" (UID: "0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3") : references non-existent secret key: tls.crt Apr 17 16:38:49.685197 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:49.685155 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-f4gvn" event={"ID":"ca75d58c-0feb-4520-ae8a-65cc6c3f6d67","Type":"ContainerStarted","Data":"2e9927009a5b1961bb8aec4967f9cad28cedc954ecabbecfd72ad0885b7861e9"} Apr 17 16:38:49.685362 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:49.685296 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-f4gvn" Apr 17 16:38:49.713121 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:49.713066 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-f4gvn" podStartSLOduration=1.235302706 podStartE2EDuration="2.713052672s" podCreationTimestamp="2026-04-17 16:38:47 +0000 UTC" firstStartedPulling="2026-04-17 16:38:48.078638531 +0000 UTC m=+449.301966288" lastFinishedPulling="2026-04-17 16:38:49.556388491 +0000 UTC m=+450.779716254" observedRunningTime="2026-04-17 16:38:49.712269715 +0000 UTC m=+450.935597494" watchObservedRunningTime="2026-04-17 16:38:49.713052672 +0000 UTC m=+450.936380449" Apr 17 16:38:50.713338 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:50.713298 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/151d5585-c63e-46bd-840a-222a14ad1b36-certificates\") pod \"keda-operator-ffbb595cb-b4lpf\" (UID: 
\"151d5585-c63e-46bd-840a-222a14ad1b36\") " pod="openshift-keda/keda-operator-ffbb595cb-b4lpf" Apr 17 16:38:50.713717 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:50.713413 2574 secret.go:281] references non-existent secret key: ca.crt Apr 17 16:38:50.713717 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:50.713425 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 16:38:50.713717 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:50.713434 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-b4lpf: references non-existent secret key: ca.crt Apr 17 16:38:50.713717 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:50.713482 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/151d5585-c63e-46bd-840a-222a14ad1b36-certificates podName:151d5585-c63e-46bd-840a-222a14ad1b36 nodeName:}" failed. No retries permitted until 2026-04-17 16:38:54.713467077 +0000 UTC m=+455.936794835 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/151d5585-c63e-46bd-840a-222a14ad1b36-certificates") pod "keda-operator-ffbb595cb-b4lpf" (UID: "151d5585-c63e-46bd-840a-222a14ad1b36") : references non-existent secret key: ca.crt Apr 17 16:38:51.218000 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:51.217957 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-x9f2p\" (UID: \"0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p" Apr 17 16:38:51.218188 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:51.218096 2574 secret.go:281] references non-existent secret key: tls.crt Apr 17 16:38:51.218188 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:51.218116 2574 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 17 16:38:51.218188 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:51.218133 2574 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p: references non-existent secret key: tls.crt Apr 17 16:38:51.218188 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:38:51.218184 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3-certificates podName:0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3 nodeName:}" failed. No retries permitted until 2026-04-17 16:38:55.218169498 +0000 UTC m=+456.441497257 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3-certificates") pod "keda-metrics-apiserver-7c9f485588-x9f2p" (UID: "0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3") : references non-existent secret key: tls.crt Apr 17 16:38:54.748474 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:54.748438 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/151d5585-c63e-46bd-840a-222a14ad1b36-certificates\") pod \"keda-operator-ffbb595cb-b4lpf\" (UID: \"151d5585-c63e-46bd-840a-222a14ad1b36\") " pod="openshift-keda/keda-operator-ffbb595cb-b4lpf" Apr 17 16:38:54.750873 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:54.750852 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/151d5585-c63e-46bd-840a-222a14ad1b36-certificates\") pod \"keda-operator-ffbb595cb-b4lpf\" (UID: \"151d5585-c63e-46bd-840a-222a14ad1b36\") " pod="openshift-keda/keda-operator-ffbb595cb-b4lpf" Apr 17 16:38:55.046670 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:55.046639 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-b4lpf" Apr 17 16:38:55.164857 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:55.164830 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-b4lpf"] Apr 17 16:38:55.167114 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:38:55.167086 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod151d5585_c63e_46bd_840a_222a14ad1b36.slice/crio-3a6590ad237a9b9df3cbba68aa61f03896097df58fc4ee36a9630e9daecbb58f WatchSource:0}: Error finding container 3a6590ad237a9b9df3cbba68aa61f03896097df58fc4ee36a9630e9daecbb58f: Status 404 returned error can't find the container with id 3a6590ad237a9b9df3cbba68aa61f03896097df58fc4ee36a9630e9daecbb58f Apr 17 16:38:55.251901 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:55.251865 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-x9f2p\" (UID: \"0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p" Apr 17 16:38:55.254471 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:55.254437 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-x9f2p\" (UID: \"0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p" Apr 17 16:38:55.523631 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:55.523606 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p" Apr 17 16:38:55.643821 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:55.643779 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p"] Apr 17 16:38:55.646555 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:38:55.646532 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dd86cd5_49f7_4b0a_b6bb_069ea29e86f3.slice/crio-8d1728321b54458f846f0046583f456f4fbb52448f2fefc9697405c5aef51d21 WatchSource:0}: Error finding container 8d1728321b54458f846f0046583f456f4fbb52448f2fefc9697405c5aef51d21: Status 404 returned error can't find the container with id 8d1728321b54458f846f0046583f456f4fbb52448f2fefc9697405c5aef51d21 Apr 17 16:38:55.703204 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:55.703171 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-b4lpf" event={"ID":"151d5585-c63e-46bd-840a-222a14ad1b36","Type":"ContainerStarted","Data":"3a6590ad237a9b9df3cbba68aa61f03896097df58fc4ee36a9630e9daecbb58f"} Apr 17 16:38:55.704244 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:55.704223 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p" event={"ID":"0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3","Type":"ContainerStarted","Data":"8d1728321b54458f846f0046583f456f4fbb52448f2fefc9697405c5aef51d21"} Apr 17 16:38:59.722452 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:59.722412 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-b4lpf" event={"ID":"151d5585-c63e-46bd-840a-222a14ad1b36","Type":"ContainerStarted","Data":"09f2f76a16bbafceadcd2eb198873934ebc77a791217499a0694b7755a6ea576"} Apr 17 16:38:59.722948 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:59.722494 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-b4lpf" Apr 17 16:38:59.723687 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:59.723662 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p" event={"ID":"0dd86cd5-49f7-4b0a-b6bb-069ea29e86f3","Type":"ContainerStarted","Data":"514e4318a9d2dd40196b2a01e4ac843103662e87a81027ce412a5543e86de98d"} Apr 17 16:38:59.723831 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:59.723802 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p" Apr 17 16:38:59.740647 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:59.740606 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-b4lpf" podStartSLOduration=9.86302191 podStartE2EDuration="13.740593628s" podCreationTimestamp="2026-04-17 16:38:46 +0000 UTC" firstStartedPulling="2026-04-17 16:38:55.168318346 +0000 UTC m=+456.391646103" lastFinishedPulling="2026-04-17 16:38:59.045890051 +0000 UTC m=+460.269217821" observedRunningTime="2026-04-17 16:38:59.738892983 +0000 UTC m=+460.962220761" watchObservedRunningTime="2026-04-17 16:38:59.740593628 +0000 UTC m=+460.963921405" Apr 17 16:38:59.756456 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:38:59.756406 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p" podStartSLOduration=9.363293562 
podStartE2EDuration="12.756391322s" podCreationTimestamp="2026-04-17 16:38:47 +0000 UTC" firstStartedPulling="2026-04-17 16:38:55.647875694 +0000 UTC m=+456.871203450" lastFinishedPulling="2026-04-17 16:38:59.040973243 +0000 UTC m=+460.264301210" observedRunningTime="2026-04-17 16:38:59.755078105 +0000 UTC m=+460.978405888" watchObservedRunningTime="2026-04-17 16:38:59.756391322 +0000 UTC m=+460.979719102" Apr 17 16:39:07.678869 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:39:07.678839 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-7ttls" Apr 17 16:39:10.689885 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:39:10.689853 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-f4gvn" Apr 17 16:39:10.731156 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:39:10.731129 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-x9f2p" Apr 17 16:39:20.729347 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:39:20.729316 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-b4lpf" Apr 17 16:40:00.858507 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:00.858477 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-8xv8q"] Apr 17 16:40:00.860905 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:00.860889 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-85bb65f8c4-8xv8q" Apr 17 16:40:00.863226 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:00.863201 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-kjtxt\"" Apr 17 16:40:00.864286 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:00.864264 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 17 16:40:00.864376 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:00.864292 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 16:40:00.864376 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:00.864292 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 16:40:00.878607 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:00.878577 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-8xv8q"] Apr 17 16:40:00.905383 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:00.905359 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-54vqb"] Apr 17 16:40:00.907407 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:00.907392 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-54vqb" Apr 17 16:40:00.909894 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:00.909871 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 17 16:40:00.909992 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:00.909970 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-5rxj5\"" Apr 17 16:40:00.920945 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:00.920924 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-54vqb"] Apr 17 16:40:00.929316 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:00.929299 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/296ca2f8-28bd-4150-bc73-b9fb4d5a3c43-cert\") pod \"kserve-controller-manager-85bb65f8c4-8xv8q\" (UID: \"296ca2f8-28bd-4150-bc73-b9fb4d5a3c43\") " pod="kserve/kserve-controller-manager-85bb65f8c4-8xv8q" Apr 17 16:40:00.929395 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:00.929325 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d45g2\" (UniqueName: \"kubernetes.io/projected/296ca2f8-28bd-4150-bc73-b9fb4d5a3c43-kube-api-access-d45g2\") pod \"kserve-controller-manager-85bb65f8c4-8xv8q\" (UID: \"296ca2f8-28bd-4150-bc73-b9fb4d5a3c43\") " pod="kserve/kserve-controller-manager-85bb65f8c4-8xv8q" Apr 17 16:40:01.030565 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:01.030537 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/296ca2f8-28bd-4150-bc73-b9fb4d5a3c43-cert\") pod \"kserve-controller-manager-85bb65f8c4-8xv8q\" (UID: \"296ca2f8-28bd-4150-bc73-b9fb4d5a3c43\") " pod="kserve/kserve-controller-manager-85bb65f8c4-8xv8q" Apr 17 16:40:01.030717 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:01.030576 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d45g2\" (UniqueName: \"kubernetes.io/projected/296ca2f8-28bd-4150-bc73-b9fb4d5a3c43-kube-api-access-d45g2\") pod \"kserve-controller-manager-85bb65f8c4-8xv8q\" (UID: \"296ca2f8-28bd-4150-bc73-b9fb4d5a3c43\") " pod="kserve/kserve-controller-manager-85bb65f8c4-8xv8q" Apr 17 16:40:01.030717 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:01.030609 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/57f2077a-d61e-4eaf-82ca-083edfeeb652-data\") pod \"seaweedfs-86cc847c5c-54vqb\" (UID: \"57f2077a-d61e-4eaf-82ca-083edfeeb652\") " pod="kserve/seaweedfs-86cc847c5c-54vqb" Apr 17 16:40:01.030813 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:01.030754 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjpkl\" (UniqueName: \"kubernetes.io/projected/57f2077a-d61e-4eaf-82ca-083edfeeb652-kube-api-access-cjpkl\") pod \"seaweedfs-86cc847c5c-54vqb\" (UID: \"57f2077a-d61e-4eaf-82ca-083edfeeb652\") " pod="kserve/seaweedfs-86cc847c5c-54vqb" Apr 17 16:40:01.032910 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:01.032890 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/296ca2f8-28bd-4150-bc73-b9fb4d5a3c43-cert\") pod \"kserve-controller-manager-85bb65f8c4-8xv8q\" (UID: 
\"296ca2f8-28bd-4150-bc73-b9fb4d5a3c43\") " pod="kserve/kserve-controller-manager-85bb65f8c4-8xv8q" Apr 17 16:40:01.038413 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:01.038387 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d45g2\" (UniqueName: \"kubernetes.io/projected/296ca2f8-28bd-4150-bc73-b9fb4d5a3c43-kube-api-access-d45g2\") pod \"kserve-controller-manager-85bb65f8c4-8xv8q\" (UID: \"296ca2f8-28bd-4150-bc73-b9fb4d5a3c43\") " pod="kserve/kserve-controller-manager-85bb65f8c4-8xv8q" Apr 17 16:40:01.131442 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:01.131369 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjpkl\" (UniqueName: \"kubernetes.io/projected/57f2077a-d61e-4eaf-82ca-083edfeeb652-kube-api-access-cjpkl\") pod \"seaweedfs-86cc847c5c-54vqb\" (UID: \"57f2077a-d61e-4eaf-82ca-083edfeeb652\") " pod="kserve/seaweedfs-86cc847c5c-54vqb" Apr 17 16:40:01.131556 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:01.131457 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/57f2077a-d61e-4eaf-82ca-083edfeeb652-data\") pod \"seaweedfs-86cc847c5c-54vqb\" (UID: \"57f2077a-d61e-4eaf-82ca-083edfeeb652\") " pod="kserve/seaweedfs-86cc847c5c-54vqb" Apr 17 16:40:01.131852 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:01.131836 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/57f2077a-d61e-4eaf-82ca-083edfeeb652-data\") pod \"seaweedfs-86cc847c5c-54vqb\" (UID: \"57f2077a-d61e-4eaf-82ca-083edfeeb652\") " pod="kserve/seaweedfs-86cc847c5c-54vqb" Apr 17 16:40:01.145819 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:01.145799 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjpkl\" (UniqueName: \"kubernetes.io/projected/57f2077a-d61e-4eaf-82ca-083edfeeb652-kube-api-access-cjpkl\") pod \"seaweedfs-86cc847c5c-54vqb\" (UID: \"57f2077a-d61e-4eaf-82ca-083edfeeb652\") " pod="kserve/seaweedfs-86cc847c5c-54vqb" Apr 17 16:40:01.170686 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:01.170663 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-85bb65f8c4-8xv8q" Apr 17 16:40:01.216857 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:01.216623 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-54vqb" Apr 17 16:40:01.295668 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:01.295640 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-8xv8q"] Apr 17 16:40:01.297389 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:40:01.297344 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod296ca2f8_28bd_4150_bc73_b9fb4d5a3c43.slice/crio-2355c2b908e7718c3e076741c2fabeb27a07fa7a81c2d32e2e5945ee5438fd96 WatchSource:0}: Error finding container 2355c2b908e7718c3e076741c2fabeb27a07fa7a81c2d32e2e5945ee5438fd96: Status 404 returned error can't find the container with id 2355c2b908e7718c3e076741c2fabeb27a07fa7a81c2d32e2e5945ee5438fd96 Apr 17 16:40:01.343277 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:01.343248 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-54vqb"] Apr 17 16:40:01.345858 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:40:01.345826 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57f2077a_d61e_4eaf_82ca_083edfeeb652.slice/crio-ad2ba7e3144c864a2e8615aa4ef7a80eb58ceaa71cedee0c19b54736ecc293c9 WatchSource:0}: Error finding container ad2ba7e3144c864a2e8615aa4ef7a80eb58ceaa71cedee0c19b54736ecc293c9: Status 404 returned error can't find the container with id ad2ba7e3144c864a2e8615aa4ef7a80eb58ceaa71cedee0c19b54736ecc293c9 Apr 17 16:40:01.929719 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:01.929682 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-54vqb" event={"ID":"57f2077a-d61e-4eaf-82ca-083edfeeb652","Type":"ContainerStarted","Data":"ad2ba7e3144c864a2e8615aa4ef7a80eb58ceaa71cedee0c19b54736ecc293c9"} Apr 17 16:40:01.930963 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:01.930933 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85bb65f8c4-8xv8q" event={"ID":"296ca2f8-28bd-4150-bc73-b9fb4d5a3c43","Type":"ContainerStarted","Data":"2355c2b908e7718c3e076741c2fabeb27a07fa7a81c2d32e2e5945ee5438fd96"} Apr 17 16:40:05.947539 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:05.947434 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85bb65f8c4-8xv8q" event={"ID":"296ca2f8-28bd-4150-bc73-b9fb4d5a3c43","Type":"ContainerStarted","Data":"9983fa1f42fa2aa705e1ac62a8da3a07dff85e02d544fb7b6d825d1f15ad7be7"} Apr 17 16:40:05.947954 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:05.947639 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-85bb65f8c4-8xv8q" Apr 17 16:40:05.948666 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:05.948645 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-54vqb" event={"ID":"57f2077a-d61e-4eaf-82ca-083edfeeb652","Type":"ContainerStarted","Data":"7addfc3ff969a62e110906f7062d64d16775931729b53972a8e04f0225437db8"} Apr 17 16:40:05.948806 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:05.948795 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-54vqb" Apr 17 16:40:05.964352 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:05.964316 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-85bb65f8c4-8xv8q" 
podStartSLOduration=2.365241996 podStartE2EDuration="5.964304786s" podCreationTimestamp="2026-04-17 16:40:00 +0000 UTC" firstStartedPulling="2026-04-17 16:40:01.298833813 +0000 UTC m=+522.522161569" lastFinishedPulling="2026-04-17 16:40:04.897896603 +0000 UTC m=+526.121224359" observedRunningTime="2026-04-17 16:40:05.963706037 +0000 UTC m=+527.187033823" watchObservedRunningTime="2026-04-17 16:40:05.964304786 +0000 UTC m=+527.187632563" Apr 17 16:40:05.979464 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:05.979417 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-54vqb" podStartSLOduration=2.354064753 podStartE2EDuration="5.979404206s" podCreationTimestamp="2026-04-17 16:40:00 +0000 UTC" firstStartedPulling="2026-04-17 16:40:01.347126282 +0000 UTC m=+522.570454038" lastFinishedPulling="2026-04-17 16:40:04.972465735 +0000 UTC m=+526.195793491" observedRunningTime="2026-04-17 16:40:05.977719195 +0000 UTC m=+527.201046974" watchObservedRunningTime="2026-04-17 16:40:05.979404206 +0000 UTC m=+527.202731984" Apr 17 16:40:11.955053 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:11.955022 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-54vqb" Apr 17 16:40:36.267362 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:36.267329 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-8xv8q"] Apr 17 16:40:36.267914 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:36.267560 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-85bb65f8c4-8xv8q" podUID="296ca2f8-28bd-4150-bc73-b9fb4d5a3c43" containerName="manager" containerID="cri-o://9983fa1f42fa2aa705e1ac62a8da3a07dff85e02d544fb7b6d825d1f15ad7be7" gracePeriod=10 Apr 17 16:40:36.272855 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:36.272830 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-85bb65f8c4-8xv8q" Apr 17 16:40:36.295242 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:36.295220 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-plzn4"] Apr 17 16:40:36.297614 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:36.297599 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85bb65f8c4-plzn4" Apr 17 16:40:36.306716 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:36.306693 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-plzn4"] Apr 17 16:40:36.402351 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:36.402319 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxqc4\" (UniqueName: \"kubernetes.io/projected/4c3d5f71-a5f2-47f2-9db4-410ca025746a-kube-api-access-bxqc4\") pod \"kserve-controller-manager-85bb65f8c4-plzn4\" (UID: \"4c3d5f71-a5f2-47f2-9db4-410ca025746a\") " pod="kserve/kserve-controller-manager-85bb65f8c4-plzn4" Apr 17 16:40:36.402472 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:36.402374 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c3d5f71-a5f2-47f2-9db4-410ca025746a-cert\") pod \"kserve-controller-manager-85bb65f8c4-plzn4\" (UID: \"4c3d5f71-a5f2-47f2-9db4-410ca025746a\") " pod="kserve/kserve-controller-manager-85bb65f8c4-plzn4" Apr 17 16:40:36.503594 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:36.503569 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxqc4\" (UniqueName: \"kubernetes.io/projected/4c3d5f71-a5f2-47f2-9db4-410ca025746a-kube-api-access-bxqc4\") pod \"kserve-controller-manager-85bb65f8c4-plzn4\" (UID: \"4c3d5f71-a5f2-47f2-9db4-410ca025746a\") " pod="kserve/kserve-controller-manager-85bb65f8c4-plzn4" Apr 17 16:40:36.503748 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:36.503625 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c3d5f71-a5f2-47f2-9db4-410ca025746a-cert\") pod \"kserve-controller-manager-85bb65f8c4-plzn4\" (UID: \"4c3d5f71-a5f2-47f2-9db4-410ca025746a\") " pod="kserve/kserve-controller-manager-85bb65f8c4-plzn4" Apr 17 16:40:36.504839 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:36.504819 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85bb65f8c4-8xv8q" Apr 17 16:40:36.505985 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:36.505965 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c3d5f71-a5f2-47f2-9db4-410ca025746a-cert\") pod \"kserve-controller-manager-85bb65f8c4-plzn4\" (UID: \"4c3d5f71-a5f2-47f2-9db4-410ca025746a\") " pod="kserve/kserve-controller-manager-85bb65f8c4-plzn4" Apr 17 16:40:36.512613 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:36.512586 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxqc4\" (UniqueName: \"kubernetes.io/projected/4c3d5f71-a5f2-47f2-9db4-410ca025746a-kube-api-access-bxqc4\") pod \"kserve-controller-manager-85bb65f8c4-plzn4\" (UID: \"4c3d5f71-a5f2-47f2-9db4-410ca025746a\") " pod="kserve/kserve-controller-manager-85bb65f8c4-plzn4" Apr 17 16:40:36.604999 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:36.604922 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/296ca2f8-28bd-4150-bc73-b9fb4d5a3c43-cert\") pod \"296ca2f8-28bd-4150-bc73-b9fb4d5a3c43\" (UID: \"296ca2f8-28bd-4150-bc73-b9fb4d5a3c43\") " Apr 17 16:40:36.604999 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:36.604988 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d45g2\" (UniqueName: \"kubernetes.io/projected/296ca2f8-28bd-4150-bc73-b9fb4d5a3c43-kube-api-access-d45g2\") pod \"296ca2f8-28bd-4150-bc73-b9fb4d5a3c43\" (UID: \"296ca2f8-28bd-4150-bc73-b9fb4d5a3c43\") " Apr 17 16:40:36.607059 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:36.607034 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/296ca2f8-28bd-4150-bc73-b9fb4d5a3c43-kube-api-access-d45g2" (OuterVolumeSpecName: "kube-api-access-d45g2") pod "296ca2f8-28bd-4150-bc73-b9fb4d5a3c43" (UID: "296ca2f8-28bd-4150-bc73-b9fb4d5a3c43"). InnerVolumeSpecName "kube-api-access-d45g2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:40:36.607167 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:36.607035 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/296ca2f8-28bd-4150-bc73-b9fb4d5a3c43-cert" (OuterVolumeSpecName: "cert") pod "296ca2f8-28bd-4150-bc73-b9fb4d5a3c43" (UID: "296ca2f8-28bd-4150-bc73-b9fb4d5a3c43"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:40:36.650371 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:36.650335 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85bb65f8c4-plzn4" Apr 17 16:40:36.706512 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:36.706486 2574 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/296ca2f8-28bd-4150-bc73-b9fb4d5a3c43-cert\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:40:36.706629 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:36.706518 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d45g2\" (UniqueName: \"kubernetes.io/projected/296ca2f8-28bd-4150-bc73-b9fb4d5a3c43-kube-api-access-d45g2\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:40:36.763785 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:36.763707 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-plzn4"] Apr 17 16:40:36.766351 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:40:36.766325 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c3d5f71_a5f2_47f2_9db4_410ca025746a.slice/crio-941654d0800f00e52d4289adbd7b751a98d4a8bc0e032b63323c4266429ccb79 WatchSource:0}: Error finding container 941654d0800f00e52d4289adbd7b751a98d4a8bc0e032b63323c4266429ccb79: Status 404 returned error can't find the container with id 941654d0800f00e52d4289adbd7b751a98d4a8bc0e032b63323c4266429ccb79 Apr 17 16:40:37.049898 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:37.049863 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85bb65f8c4-plzn4" event={"ID":"4c3d5f71-a5f2-47f2-9db4-410ca025746a","Type":"ContainerStarted","Data":"941654d0800f00e52d4289adbd7b751a98d4a8bc0e032b63323c4266429ccb79"} Apr 17 16:40:37.050974 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:37.050949 2574 generic.go:358] "Generic (PLEG): container finished" podID="296ca2f8-28bd-4150-bc73-b9fb4d5a3c43" containerID="9983fa1f42fa2aa705e1ac62a8da3a07dff85e02d544fb7b6d825d1f15ad7be7" exitCode=0 Apr 17 16:40:37.051091 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:37.051012 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85bb65f8c4-8xv8q" Apr 17 16:40:37.051138 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:37.051016 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85bb65f8c4-8xv8q" event={"ID":"296ca2f8-28bd-4150-bc73-b9fb4d5a3c43","Type":"ContainerDied","Data":"9983fa1f42fa2aa705e1ac62a8da3a07dff85e02d544fb7b6d825d1f15ad7be7"} Apr 17 16:40:37.051138 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:37.051115 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85bb65f8c4-8xv8q" event={"ID":"296ca2f8-28bd-4150-bc73-b9fb4d5a3c43","Type":"ContainerDied","Data":"2355c2b908e7718c3e076741c2fabeb27a07fa7a81c2d32e2e5945ee5438fd96"} Apr 17 16:40:37.051138 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:37.051133 2574 scope.go:117] "RemoveContainer" containerID="9983fa1f42fa2aa705e1ac62a8da3a07dff85e02d544fb7b6d825d1f15ad7be7" Apr 17 16:40:37.059009 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:37.058992 2574 scope.go:117] "RemoveContainer" containerID="9983fa1f42fa2aa705e1ac62a8da3a07dff85e02d544fb7b6d825d1f15ad7be7" Apr 17 16:40:37.059271 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:40:37.059253 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9983fa1f42fa2aa705e1ac62a8da3a07dff85e02d544fb7b6d825d1f15ad7be7\": container with ID starting with 9983fa1f42fa2aa705e1ac62a8da3a07dff85e02d544fb7b6d825d1f15ad7be7 not found: ID does not exist" containerID="9983fa1f42fa2aa705e1ac62a8da3a07dff85e02d544fb7b6d825d1f15ad7be7" Apr 17 16:40:37.059334 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:37.059280 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9983fa1f42fa2aa705e1ac62a8da3a07dff85e02d544fb7b6d825d1f15ad7be7"} err="failed to get container status \"9983fa1f42fa2aa705e1ac62a8da3a07dff85e02d544fb7b6d825d1f15ad7be7\": rpc error: code = NotFound desc = could not find container \"9983fa1f42fa2aa705e1ac62a8da3a07dff85e02d544fb7b6d825d1f15ad7be7\": container with ID starting with 9983fa1f42fa2aa705e1ac62a8da3a07dff85e02d544fb7b6d825d1f15ad7be7 not found: ID does not exist" Apr 17 16:40:37.070988 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:37.070967 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-8xv8q"] Apr 17 16:40:37.073418 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:37.073396 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-85bb65f8c4-8xv8q"] Apr 17 16:40:37.381669 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:37.381573 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="296ca2f8-28bd-4150-bc73-b9fb4d5a3c43" path="/var/lib/kubelet/pods/296ca2f8-28bd-4150-bc73-b9fb4d5a3c43/volumes" Apr 17 16:40:38.055846 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:38.055810 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85bb65f8c4-plzn4" event={"ID":"4c3d5f71-a5f2-47f2-9db4-410ca025746a","Type":"ContainerStarted","Data":"823a0d7468292ce70bc5a5ffb1a922218e40518543a15897904f1309fb5e2c66"} Apr 17 16:40:38.056032 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:38.055913 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-85bb65f8c4-plzn4" Apr 17 16:40:38.073804 ip-10-0-129-144 kubenswrapper[2574]: I0417 
16:40:38.073758 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-85bb65f8c4-plzn4" podStartSLOduration=1.757958468 podStartE2EDuration="2.073747153s" podCreationTimestamp="2026-04-17 16:40:36 +0000 UTC" firstStartedPulling="2026-04-17 16:40:36.767641379 +0000 UTC m=+557.990969136" lastFinishedPulling="2026-04-17 16:40:37.083430051 +0000 UTC m=+558.306757821" observedRunningTime="2026-04-17 16:40:38.071843925 +0000 UTC m=+559.295171702" watchObservedRunningTime="2026-04-17 16:40:38.073747153 +0000 UTC m=+559.297074973" Apr 17 16:40:39.460884 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.460857 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cc6d8b846-gvnnk"] Apr 17 16:40:39.461229 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.461149 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="296ca2f8-28bd-4150-bc73-b9fb4d5a3c43" containerName="manager" Apr 17 16:40:39.461229 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.461159 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="296ca2f8-28bd-4150-bc73-b9fb4d5a3c43" containerName="manager" Apr 17 16:40:39.461229 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.461207 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="296ca2f8-28bd-4150-bc73-b9fb4d5a3c43" containerName="manager" Apr 17 16:40:39.463170 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.463155 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.465574 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.465554 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 16:40:39.465783 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.465763 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 16:40:39.466928 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.466910 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 16:40:39.467005 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.466940 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-fl6zn\"" Apr 17 16:40:39.467068 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.467003 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 16:40:39.467068 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.467050 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 16:40:39.470645 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.470625 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 16:40:39.475354 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.475332 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cc6d8b846-gvnnk"] Apr 17 16:40:39.631518 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.631491 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/298285ec-c3be-4bf5-8598-12b9a4f70d6d-oauth-serving-cert\") pod \"console-7cc6d8b846-gvnnk\" (UID: \"298285ec-c3be-4bf5-8598-12b9a4f70d6d\") " pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.631683 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.631524 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/298285ec-c3be-4bf5-8598-12b9a4f70d6d-service-ca\") pod \"console-7cc6d8b846-gvnnk\" (UID: \"298285ec-c3be-4bf5-8598-12b9a4f70d6d\") " pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.631683 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.631547 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/298285ec-c3be-4bf5-8598-12b9a4f70d6d-trusted-ca-bundle\") pod \"console-7cc6d8b846-gvnnk\" (UID: \"298285ec-c3be-4bf5-8598-12b9a4f70d6d\") " pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.631683 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.631632 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxhbq\" (UniqueName: \"kubernetes.io/projected/298285ec-c3be-4bf5-8598-12b9a4f70d6d-kube-api-access-rxhbq\") pod \"console-7cc6d8b846-gvnnk\" (UID: \"298285ec-c3be-4bf5-8598-12b9a4f70d6d\") " pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.631813 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.631682 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/298285ec-c3be-4bf5-8598-12b9a4f70d6d-console-serving-cert\") pod \"console-7cc6d8b846-gvnnk\" (UID: \"298285ec-c3be-4bf5-8598-12b9a4f70d6d\") " pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.631813 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.631760 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/298285ec-c3be-4bf5-8598-12b9a4f70d6d-console-oauth-config\") pod \"console-7cc6d8b846-gvnnk\" (UID: \"298285ec-c3be-4bf5-8598-12b9a4f70d6d\") " pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.631878 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.631821 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/298285ec-c3be-4bf5-8598-12b9a4f70d6d-console-config\") pod \"console-7cc6d8b846-gvnnk\" (UID: \"298285ec-c3be-4bf5-8598-12b9a4f70d6d\") " pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.732140 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.732072 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/298285ec-c3be-4bf5-8598-12b9a4f70d6d-trusted-ca-bundle\") pod \"console-7cc6d8b846-gvnnk\" (UID: \"298285ec-c3be-4bf5-8598-12b9a4f70d6d\") " pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.732140 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.732110 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxhbq\" (UniqueName: \"kubernetes.io/projected/298285ec-c3be-4bf5-8598-12b9a4f70d6d-kube-api-access-rxhbq\") pod \"console-7cc6d8b846-gvnnk\" (UID: 
\"298285ec-c3be-4bf5-8598-12b9a4f70d6d\") " pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.732140 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.732138 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/298285ec-c3be-4bf5-8598-12b9a4f70d6d-console-serving-cert\") pod \"console-7cc6d8b846-gvnnk\" (UID: \"298285ec-c3be-4bf5-8598-12b9a4f70d6d\") " pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.732384 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.732159 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/298285ec-c3be-4bf5-8598-12b9a4f70d6d-console-oauth-config\") pod \"console-7cc6d8b846-gvnnk\" (UID: \"298285ec-c3be-4bf5-8598-12b9a4f70d6d\") " pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.732384 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.732189 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/298285ec-c3be-4bf5-8598-12b9a4f70d6d-console-config\") pod \"console-7cc6d8b846-gvnnk\" (UID: \"298285ec-c3be-4bf5-8598-12b9a4f70d6d\") " pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.732384 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.732215 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/298285ec-c3be-4bf5-8598-12b9a4f70d6d-oauth-serving-cert\") pod \"console-7cc6d8b846-gvnnk\" (UID: \"298285ec-c3be-4bf5-8598-12b9a4f70d6d\") " pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.732384 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.732232 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/298285ec-c3be-4bf5-8598-12b9a4f70d6d-service-ca\") pod \"console-7cc6d8b846-gvnnk\" (UID: \"298285ec-c3be-4bf5-8598-12b9a4f70d6d\") " pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.733041 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.733012 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/298285ec-c3be-4bf5-8598-12b9a4f70d6d-service-ca\") pod \"console-7cc6d8b846-gvnnk\" (UID: \"298285ec-c3be-4bf5-8598-12b9a4f70d6d\") " pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.733041 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.733030 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/298285ec-c3be-4bf5-8598-12b9a4f70d6d-trusted-ca-bundle\") pod \"console-7cc6d8b846-gvnnk\" (UID: \"298285ec-c3be-4bf5-8598-12b9a4f70d6d\") " pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.733232 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.733048 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/298285ec-c3be-4bf5-8598-12b9a4f70d6d-console-config\") pod \"console-7cc6d8b846-gvnnk\" (UID: \"298285ec-c3be-4bf5-8598-12b9a4f70d6d\") " pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.733232 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.733059 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/298285ec-c3be-4bf5-8598-12b9a4f70d6d-oauth-serving-cert\") pod \"console-7cc6d8b846-gvnnk\" (UID: \"298285ec-c3be-4bf5-8598-12b9a4f70d6d\") " pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.734581 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.734561 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/298285ec-c3be-4bf5-8598-12b9a4f70d6d-console-serving-cert\") pod \"console-7cc6d8b846-gvnnk\" (UID: \"298285ec-c3be-4bf5-8598-12b9a4f70d6d\") " pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.734671 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.734623 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/298285ec-c3be-4bf5-8598-12b9a4f70d6d-console-oauth-config\") pod \"console-7cc6d8b846-gvnnk\" (UID: \"298285ec-c3be-4bf5-8598-12b9a4f70d6d\") " pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.741985 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.741962 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxhbq\" (UniqueName: \"kubernetes.io/projected/298285ec-c3be-4bf5-8598-12b9a4f70d6d-kube-api-access-rxhbq\") pod \"console-7cc6d8b846-gvnnk\" (UID: \"298285ec-c3be-4bf5-8598-12b9a4f70d6d\") " pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.773595 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.773573 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:39.896501 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:39.896474 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cc6d8b846-gvnnk"] Apr 17 16:40:39.898451 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:40:39.898428 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod298285ec_c3be_4bf5_8598_12b9a4f70d6d.slice/crio-79e521fb297ad74c49ee4203e3a8ea5e40daa8790d6ad513551dff5d3f7e0368 WatchSource:0}: Error finding container 79e521fb297ad74c49ee4203e3a8ea5e40daa8790d6ad513551dff5d3f7e0368: Status 404 returned error can't find the container with id 79e521fb297ad74c49ee4203e3a8ea5e40daa8790d6ad513551dff5d3f7e0368 Apr 17 16:40:40.064072 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:40.064037 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cc6d8b846-gvnnk" event={"ID":"298285ec-c3be-4bf5-8598-12b9a4f70d6d","Type":"ContainerStarted","Data":"80a798f0da41b7ba1d9bd5a3d64afda98ad9922f75d29b5504e225df7af21242"} Apr 17 16:40:40.064072 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:40.064075 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cc6d8b846-gvnnk" event={"ID":"298285ec-c3be-4bf5-8598-12b9a4f70d6d","Type":"ContainerStarted","Data":"79e521fb297ad74c49ee4203e3a8ea5e40daa8790d6ad513551dff5d3f7e0368"} Apr 17 16:40:40.083461 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:40.083417 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cc6d8b846-gvnnk" podStartSLOduration=1.083402859 podStartE2EDuration="1.083402859s" podCreationTimestamp="2026-04-17 16:40:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 
16:40:40.081499645 +0000 UTC m=+561.304827422" watchObservedRunningTime="2026-04-17 16:40:40.083402859 +0000 UTC m=+561.306730640" Apr 17 16:40:49.774374 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:49.774333 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:49.774374 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:49.774381 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:49.779149 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:49.779117 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:40:50.098166 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:40:50.098081 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cc6d8b846-gvnnk" Apr 17 16:41:09.064255 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:09.064219 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-85bb65f8c4-plzn4" Apr 17 16:41:10.169752 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:10.169702 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-4rf2d"] Apr 17 16:41:10.173115 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:10.173097 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-4rf2d" Apr 17 16:41:10.179217 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:10.179195 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-wzbll\"" Apr 17 16:41:10.180719 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:10.180699 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 17 16:41:10.206564 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:10.206543 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-4rf2d"] Apr 17 16:41:10.268149 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:10.268119 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cb79293-2f28-472a-be51-fe2e1dd8c1fa-cert\") pod \"odh-model-controller-696fc77849-4rf2d\" (UID: \"0cb79293-2f28-472a-be51-fe2e1dd8c1fa\") " pod="kserve/odh-model-controller-696fc77849-4rf2d" Apr 17 16:41:10.268295 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:10.268195 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5zmn\" (UniqueName: \"kubernetes.io/projected/0cb79293-2f28-472a-be51-fe2e1dd8c1fa-kube-api-access-v5zmn\") pod \"odh-model-controller-696fc77849-4rf2d\" (UID: \"0cb79293-2f28-472a-be51-fe2e1dd8c1fa\") " pod="kserve/odh-model-controller-696fc77849-4rf2d" Apr 17 16:41:10.369032 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:10.369001 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cb79293-2f28-472a-be51-fe2e1dd8c1fa-cert\") pod \"odh-model-controller-696fc77849-4rf2d\" (UID: \"0cb79293-2f28-472a-be51-fe2e1dd8c1fa\") " pod="kserve/odh-model-controller-696fc77849-4rf2d" Apr 17 16:41:10.369171 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:10.369065 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5zmn\" (UniqueName: \"kubernetes.io/projected/0cb79293-2f28-472a-be51-fe2e1dd8c1fa-kube-api-access-v5zmn\") pod \"odh-model-controller-696fc77849-4rf2d\" (UID: \"0cb79293-2f28-472a-be51-fe2e1dd8c1fa\") " pod="kserve/odh-model-controller-696fc77849-4rf2d" Apr 17 16:41:10.371379 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:10.371359 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cb79293-2f28-472a-be51-fe2e1dd8c1fa-cert\") pod \"odh-model-controller-696fc77849-4rf2d\" (UID: \"0cb79293-2f28-472a-be51-fe2e1dd8c1fa\") " pod="kserve/odh-model-controller-696fc77849-4rf2d" Apr 17 16:41:10.377483 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:10.377464 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5zmn\" (UniqueName: \"kubernetes.io/projected/0cb79293-2f28-472a-be51-fe2e1dd8c1fa-kube-api-access-v5zmn\") pod \"odh-model-controller-696fc77849-4rf2d\" (UID: \"0cb79293-2f28-472a-be51-fe2e1dd8c1fa\") " pod="kserve/odh-model-controller-696fc77849-4rf2d" Apr 17 16:41:10.483805 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:10.483719 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-4rf2d" Apr 17 16:41:10.601845 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:10.601822 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-4rf2d"] Apr 17 16:41:10.604253 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:41:10.604225 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cb79293_2f28_472a_be51_fe2e1dd8c1fa.slice/crio-18c46d1fd842df38b9d69c0dc23c2cd8a9bece9469b71482178aaeafb530b80b WatchSource:0}: Error finding container 18c46d1fd842df38b9d69c0dc23c2cd8a9bece9469b71482178aaeafb530b80b: Status 404 returned error can't find the container with id 18c46d1fd842df38b9d69c0dc23c2cd8a9bece9469b71482178aaeafb530b80b Apr 17 16:41:11.165947 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:11.165886 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-4rf2d" event={"ID":"0cb79293-2f28-472a-be51-fe2e1dd8c1fa","Type":"ContainerStarted","Data":"18c46d1fd842df38b9d69c0dc23c2cd8a9bece9469b71482178aaeafb530b80b"} Apr 17 16:41:13.174268 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:13.174234 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-4rf2d" event={"ID":"0cb79293-2f28-472a-be51-fe2e1dd8c1fa","Type":"ContainerStarted","Data":"d75a00ef96c1b6f74aa8e13ebfdf91ae1143f4b61b31c79b9dc7e5fc2d780c3d"} Apr 17 16:41:13.174624 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:13.174279 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-4rf2d" Apr 17 16:41:13.192625 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:13.192579 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-4rf2d" podStartSLOduration=0.756996388 podStartE2EDuration="3.192566385s" podCreationTimestamp="2026-04-17 16:41:10 +0000 UTC" firstStartedPulling="2026-04-17 16:41:10.605364989 +0000 UTC m=+591.828692745" lastFinishedPulling="2026-04-17 16:41:13.040934977 +0000 UTC m=+594.264262742" observedRunningTime="2026-04-17 16:41:13.191291845 
+0000 UTC m=+594.414619636" watchObservedRunningTime="2026-04-17 16:41:13.192566385 +0000 UTC m=+594.415894160" Apr 17 16:41:19.279253 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:19.279224 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 16:41:19.280942 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:19.280920 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 16:41:24.180373 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:24.180343 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-4rf2d" Apr 17 16:41:44.078471 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.078433 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f"] Apr 17 16:41:44.086388 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.086366 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" Apr 17 16:41:44.089291 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.089265 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\"" Apr 17 16:41:44.089476 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.089305 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-graph-1-predictor-serving-cert\"" Apr 17 16:41:44.089541 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.089265 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 16:41:44.089629 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.089611 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tvtvv\"" Apr 17 16:41:44.089846 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.089814 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 16:41:44.090624 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.090603 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f"] Apr 17 16:41:44.141433 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.141406 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a67d317-a529-4322-97ba-9f8fb27dc2b7-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f\" (UID: \"8a67d317-a529-4322-97ba-9f8fb27dc2b7\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" Apr 17 16:41:44.141554 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.141456 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8a67d317-a529-4322-97ba-9f8fb27dc2b7-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f\" (UID: \"8a67d317-a529-4322-97ba-9f8fb27dc2b7\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" Apr 17 
16:41:44.141554 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.141497 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7nsd\" (UniqueName: \"kubernetes.io/projected/8a67d317-a529-4322-97ba-9f8fb27dc2b7-kube-api-access-g7nsd\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f\" (UID: \"8a67d317-a529-4322-97ba-9f8fb27dc2b7\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" Apr 17 16:41:44.141554 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.141532 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8a67d317-a529-4322-97ba-9f8fb27dc2b7-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f\" (UID: \"8a67d317-a529-4322-97ba-9f8fb27dc2b7\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" Apr 17 16:41:44.241990 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.241953 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8a67d317-a529-4322-97ba-9f8fb27dc2b7-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f\" (UID: \"8a67d317-a529-4322-97ba-9f8fb27dc2b7\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" Apr 17 16:41:44.242279 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.242255 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a67d317-a529-4322-97ba-9f8fb27dc2b7-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f\" (UID: \"8a67d317-a529-4322-97ba-9f8fb27dc2b7\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" Apr 17 16:41:44.242411 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.242391 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8a67d317-a529-4322-97ba-9f8fb27dc2b7-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f\" (UID: \"8a67d317-a529-4322-97ba-9f8fb27dc2b7\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" Apr 17 16:41:44.242477 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.242433 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g7nsd\" (UniqueName: \"kubernetes.io/projected/8a67d317-a529-4322-97ba-9f8fb27dc2b7-kube-api-access-g7nsd\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f\" (UID: \"8a67d317-a529-4322-97ba-9f8fb27dc2b7\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" Apr 17 16:41:44.242643 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.242618 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a67d317-a529-4322-97ba-9f8fb27dc2b7-kserve-provision-location\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f\" (UID: \"8a67d317-a529-4322-97ba-9f8fb27dc2b7\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" Apr 17 16:41:44.242750 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.242696 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8a67d317-a529-4322-97ba-9f8fb27dc2b7-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f\" (UID: \"8a67d317-a529-4322-97ba-9f8fb27dc2b7\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" Apr 17 16:41:44.245473 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.245445 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8a67d317-a529-4322-97ba-9f8fb27dc2b7-proxy-tls\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f\" (UID: \"8a67d317-a529-4322-97ba-9f8fb27dc2b7\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" Apr 17 16:41:44.255657 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.255625 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7nsd\" (UniqueName: \"kubernetes.io/projected/8a67d317-a529-4322-97ba-9f8fb27dc2b7-kube-api-access-g7nsd\") pod \"isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f\" (UID: \"8a67d317-a529-4322-97ba-9f8fb27dc2b7\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" Apr 17 16:41:44.399003 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.398922 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" Apr 17 16:41:44.526706 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:44.526674 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f"] Apr 17 16:41:44.529779 ip-10-0-129-144 kubenswrapper[2574]: W0417 16:41:44.529750 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a67d317_a529_4322_97ba_9f8fb27dc2b7.slice/crio-2d2ecd2b455c31f3675305877660cb97b7aa1249ac8ff2e512389b683c3c6298 WatchSource:0}: Error finding container 2d2ecd2b455c31f3675305877660cb97b7aa1249ac8ff2e512389b683c3c6298: Status 404 returned error can't find the container with id 2d2ecd2b455c31f3675305877660cb97b7aa1249ac8ff2e512389b683c3c6298 Apr 17 16:41:45.278209 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:45.278166 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" event={"ID":"8a67d317-a529-4322-97ba-9f8fb27dc2b7","Type":"ContainerStarted","Data":"2d2ecd2b455c31f3675305877660cb97b7aa1249ac8ff2e512389b683c3c6298"} Apr 17 16:41:49.293091 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:49.293052 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" event={"ID":"8a67d317-a529-4322-97ba-9f8fb27dc2b7","Type":"ContainerStarted","Data":"ea35591fc834d936fbab1e7887da10321f77fb87ce6eb02b75b318dcb9758ded"} Apr 17 16:41:52.304111 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:52.304067 2574 generic.go:358] "Generic (PLEG): container finished" podID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerID="ea35591fc834d936fbab1e7887da10321f77fb87ce6eb02b75b318dcb9758ded" exitCode=0 Apr 17 16:41:52.304520 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:41:52.304142 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" 
event={"ID":"8a67d317-a529-4322-97ba-9f8fb27dc2b7","Type":"ContainerDied","Data":"ea35591fc834d936fbab1e7887da10321f77fb87ce6eb02b75b318dcb9758ded"} Apr 17 16:42:05.352832 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:42:05.352791 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" event={"ID":"8a67d317-a529-4322-97ba-9f8fb27dc2b7","Type":"ContainerStarted","Data":"b49d011e1d3752ae0db3bb293c229ee41de24d9b1ab8de93d8bb048400408912"} Apr 17 16:42:08.363723 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:42:08.363684 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" event={"ID":"8a67d317-a529-4322-97ba-9f8fb27dc2b7","Type":"ContainerStarted","Data":"610aa7c8d4b567dfa10c0a58186a7da60c01b6034f3ea86de763176bfc16308b"} Apr 17 16:42:08.364220 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:42:08.363807 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" Apr 17 16:42:08.384546 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:42:08.384497 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" podStartSLOduration=1.505585886 podStartE2EDuration="24.384485146s" podCreationTimestamp="2026-04-17 16:41:44 +0000 UTC" firstStartedPulling="2026-04-17 16:41:44.531771225 +0000 UTC m=+625.755098982" lastFinishedPulling="2026-04-17 16:42:07.410670471 +0000 UTC m=+648.633998242" observedRunningTime="2026-04-17 16:42:08.382221312 +0000 UTC m=+649.605549087" watchObservedRunningTime="2026-04-17 16:42:08.384485146 +0000 UTC m=+649.607812922" Apr 17 16:42:09.366922 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:42:09.366887 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" Apr 17 16:42:09.368172 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:42:09.368145 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" podUID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 16:42:10.369866 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:42:10.369821 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" podUID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 16:42:15.374142 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:42:15.374110 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" Apr 17 16:42:15.374623 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:42:15.374595 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" podUID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 16:42:25.374699 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:42:25.374660 2574 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" podUID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 16:42:35.374872 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:42:35.374826 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" podUID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 16:42:45.375447 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:42:45.375405 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" podUID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 16:42:55.374785 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:42:55.374745 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" podUID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 16:43:05.374920 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:05.374881 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" podUID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 16:43:15.375791 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:15.375763 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" Apr 17 16:43:54.423326 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:54.423289 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f"] Apr 17 16:43:54.423954 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:54.423707 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" podUID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerName="kserve-container" containerID="cri-o://b49d011e1d3752ae0db3bb293c229ee41de24d9b1ab8de93d8bb048400408912" gracePeriod=30 Apr 17 16:43:54.423954 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:54.423747 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" podUID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerName="kube-rbac-proxy" containerID="cri-o://610aa7c8d4b567dfa10c0a58186a7da60c01b6034f3ea86de763176bfc16308b" gracePeriod=30 Apr 17 16:43:54.704413 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:54.704340 2574 generic.go:358] "Generic (PLEG): container finished" podID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerID="610aa7c8d4b567dfa10c0a58186a7da60c01b6034f3ea86de763176bfc16308b" exitCode=2 Apr 17 16:43:54.704543 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:54.704416 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" 
event={"ID":"8a67d317-a529-4322-97ba-9f8fb27dc2b7","Type":"ContainerDied","Data":"610aa7c8d4b567dfa10c0a58186a7da60c01b6034f3ea86de763176bfc16308b"} Apr 17 16:43:55.370439 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:55.370395 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" podUID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.133.0.32:8643/healthz\": dial tcp 10.133.0.32:8643: connect: connection refused" Apr 17 16:43:55.374793 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:55.374692 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" podUID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 17 16:43:58.263376 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.263351 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" Apr 17 16:43:58.338008 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.337950 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8a67d317-a529-4322-97ba-9f8fb27dc2b7-proxy-tls\") pod \"8a67d317-a529-4322-97ba-9f8fb27dc2b7\" (UID: \"8a67d317-a529-4322-97ba-9f8fb27dc2b7\") " Apr 17 16:43:58.338008 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.337984 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7nsd\" (UniqueName: \"kubernetes.io/projected/8a67d317-a529-4322-97ba-9f8fb27dc2b7-kube-api-access-g7nsd\") pod \"8a67d317-a529-4322-97ba-9f8fb27dc2b7\" (UID: \"8a67d317-a529-4322-97ba-9f8fb27dc2b7\") " Apr 17 16:43:58.338165 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.338035 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8a67d317-a529-4322-97ba-9f8fb27dc2b7-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") pod \"8a67d317-a529-4322-97ba-9f8fb27dc2b7\" (UID: \"8a67d317-a529-4322-97ba-9f8fb27dc2b7\") " Apr 17 16:43:58.338165 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.338054 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a67d317-a529-4322-97ba-9f8fb27dc2b7-kserve-provision-location\") pod \"8a67d317-a529-4322-97ba-9f8fb27dc2b7\" (UID: \"8a67d317-a529-4322-97ba-9f8fb27dc2b7\") " Apr 17 16:43:58.338414 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.338358 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a67d317-a529-4322-97ba-9f8fb27dc2b7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8a67d317-a529-4322-97ba-9f8fb27dc2b7" (UID: "8a67d317-a529-4322-97ba-9f8fb27dc2b7"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:43:58.338525 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.338382 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a67d317-a529-4322-97ba-9f8fb27dc2b7-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config") pod "8a67d317-a529-4322-97ba-9f8fb27dc2b7" (UID: "8a67d317-a529-4322-97ba-9f8fb27dc2b7"). InnerVolumeSpecName "isvc-sklearn-graph-1-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:43:58.340176 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.340154 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a67d317-a529-4322-97ba-9f8fb27dc2b7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8a67d317-a529-4322-97ba-9f8fb27dc2b7" (UID: "8a67d317-a529-4322-97ba-9f8fb27dc2b7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:43:58.340266 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.340233 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a67d317-a529-4322-97ba-9f8fb27dc2b7-kube-api-access-g7nsd" (OuterVolumeSpecName: "kube-api-access-g7nsd") pod "8a67d317-a529-4322-97ba-9f8fb27dc2b7" (UID: "8a67d317-a529-4322-97ba-9f8fb27dc2b7"). InnerVolumeSpecName "kube-api-access-g7nsd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:43:58.439510 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.439477 2574 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8a67d317-a529-4322-97ba-9f8fb27dc2b7-isvc-sklearn-graph-1-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:43:58.439510 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.439503 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a67d317-a529-4322-97ba-9f8fb27dc2b7-kserve-provision-location\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:43:58.439510 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.439513 2574 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8a67d317-a529-4322-97ba-9f8fb27dc2b7-proxy-tls\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:43:58.439818 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.439522 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g7nsd\" (UniqueName: \"kubernetes.io/projected/8a67d317-a529-4322-97ba-9f8fb27dc2b7-kube-api-access-g7nsd\") on node \"ip-10-0-129-144.ec2.internal\" DevicePath \"\"" Apr 17 16:43:58.718779 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.718665 2574 generic.go:358] "Generic (PLEG): container finished" podID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerID="b49d011e1d3752ae0db3bb293c229ee41de24d9b1ab8de93d8bb048400408912" exitCode=0 Apr 17 16:43:58.718779 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.718763 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" event={"ID":"8a67d317-a529-4322-97ba-9f8fb27dc2b7","Type":"ContainerDied","Data":"b49d011e1d3752ae0db3bb293c229ee41de24d9b1ab8de93d8bb048400408912"} Apr 17 16:43:58.718779 ip-10-0-129-144 
kubenswrapper[2574]: I0417 16:43:58.718776 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" Apr 17 16:43:58.719047 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.718806 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f" event={"ID":"8a67d317-a529-4322-97ba-9f8fb27dc2b7","Type":"ContainerDied","Data":"2d2ecd2b455c31f3675305877660cb97b7aa1249ac8ff2e512389b683c3c6298"} Apr 17 16:43:58.719047 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.718823 2574 scope.go:117] "RemoveContainer" containerID="610aa7c8d4b567dfa10c0a58186a7da60c01b6034f3ea86de763176bfc16308b" Apr 17 16:43:58.726948 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.726929 2574 scope.go:117] "RemoveContainer" containerID="b49d011e1d3752ae0db3bb293c229ee41de24d9b1ab8de93d8bb048400408912" Apr 17 16:43:58.733937 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.733913 2574 scope.go:117] "RemoveContainer" containerID="ea35591fc834d936fbab1e7887da10321f77fb87ce6eb02b75b318dcb9758ded" Apr 17 16:43:58.739630 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.739595 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f"] Apr 17 16:43:58.741214 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.741193 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-1-predictor-5b497dcd98-gqt9f"] Apr 17 16:43:58.742092 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.742079 2574 scope.go:117] "RemoveContainer" containerID="610aa7c8d4b567dfa10c0a58186a7da60c01b6034f3ea86de763176bfc16308b" Apr 17 16:43:58.742331 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:43:58.742314 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"610aa7c8d4b567dfa10c0a58186a7da60c01b6034f3ea86de763176bfc16308b\": container with ID starting with 610aa7c8d4b567dfa10c0a58186a7da60c01b6034f3ea86de763176bfc16308b not found: ID does not exist" containerID="610aa7c8d4b567dfa10c0a58186a7da60c01b6034f3ea86de763176bfc16308b" Apr 17 16:43:58.742377 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.742338 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"610aa7c8d4b567dfa10c0a58186a7da60c01b6034f3ea86de763176bfc16308b"} err="failed to get container status \"610aa7c8d4b567dfa10c0a58186a7da60c01b6034f3ea86de763176bfc16308b\": rpc error: code = NotFound desc = could not find container \"610aa7c8d4b567dfa10c0a58186a7da60c01b6034f3ea86de763176bfc16308b\": container with ID starting with 610aa7c8d4b567dfa10c0a58186a7da60c01b6034f3ea86de763176bfc16308b not found: ID does not exist" Apr 17 16:43:58.742377 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.742356 2574 scope.go:117] "RemoveContainer" containerID="b49d011e1d3752ae0db3bb293c229ee41de24d9b1ab8de93d8bb048400408912" Apr 17 16:43:58.742561 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:43:58.742542 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b49d011e1d3752ae0db3bb293c229ee41de24d9b1ab8de93d8bb048400408912\": container with ID starting with b49d011e1d3752ae0db3bb293c229ee41de24d9b1ab8de93d8bb048400408912 not found: ID does not exist" containerID="b49d011e1d3752ae0db3bb293c229ee41de24d9b1ab8de93d8bb048400408912" Apr 
17 16:43:58.742604 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.742568 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b49d011e1d3752ae0db3bb293c229ee41de24d9b1ab8de93d8bb048400408912"} err="failed to get container status \"b49d011e1d3752ae0db3bb293c229ee41de24d9b1ab8de93d8bb048400408912\": rpc error: code = NotFound desc = could not find container \"b49d011e1d3752ae0db3bb293c229ee41de24d9b1ab8de93d8bb048400408912\": container with ID starting with b49d011e1d3752ae0db3bb293c229ee41de24d9b1ab8de93d8bb048400408912 not found: ID does not exist" Apr 17 16:43:58.742604 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.742585 2574 scope.go:117] "RemoveContainer" containerID="ea35591fc834d936fbab1e7887da10321f77fb87ce6eb02b75b318dcb9758ded" Apr 17 16:43:58.742847 ip-10-0-129-144 kubenswrapper[2574]: E0417 16:43:58.742830 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea35591fc834d936fbab1e7887da10321f77fb87ce6eb02b75b318dcb9758ded\": container with ID starting with ea35591fc834d936fbab1e7887da10321f77fb87ce6eb02b75b318dcb9758ded not found: ID does not exist" containerID="ea35591fc834d936fbab1e7887da10321f77fb87ce6eb02b75b318dcb9758ded" Apr 17 16:43:58.742913 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:58.742850 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea35591fc834d936fbab1e7887da10321f77fb87ce6eb02b75b318dcb9758ded"} err="failed to get container status \"ea35591fc834d936fbab1e7887da10321f77fb87ce6eb02b75b318dcb9758ded\": rpc error: code = NotFound desc = could not find container \"ea35591fc834d936fbab1e7887da10321f77fb87ce6eb02b75b318dcb9758ded\": container with ID starting with ea35591fc834d936fbab1e7887da10321f77fb87ce6eb02b75b318dcb9758ded not found: ID does not exist" Apr 17 16:43:59.382458 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:43:59.382426 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" path="/var/lib/kubelet/pods/8a67d317-a529-4322-97ba-9f8fb27dc2b7/volumes" Apr 17 16:46:19.300425 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:46:19.300345 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 16:46:19.303904 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:46:19.303883 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 16:51:19.322795 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:51:19.322762 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 16:51:19.326648 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:51:19.326627 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 16:56:19.344241 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:56:19.344211 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 16:56:19.348583 ip-10-0-129-144 kubenswrapper[2574]: I0417 16:56:19.348564 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 17:01:19.370557 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:01:19.370528 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 17:01:19.375317 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:01:19.375298 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 17:06:19.400002 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:06:19.399971 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 17:06:19.402336 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:06:19.402269 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 17:11:19.421710 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:11:19.421608 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 17:11:19.425555 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:11:19.423909 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 17:16:19.442381 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:16:19.442255 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 17:16:19.446322 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:16:19.445420 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 17:21:19.464070 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:19.463968 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 17:21:19.468057 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:19.468041 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 17:21:42.652962 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:42.652934 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-5phzj_a437f6b6-ca4b-4b4f-b5b1-5f126c2fac1c/global-pull-secret-syncer/0.log" Apr 17 17:21:42.917524 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:42.917449 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xztmj_b67a3101-cd16-466d-bb65-1fcd6158a8f4/konnectivity-agent/0.log" Apr 17 17:21:42.942885 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:42.942855 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-144.ec2.internal_c1defed80364f63d2790d12bb9e06eb2/haproxy/0.log" Apr 17 17:21:46.409075 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:46.409044 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-8psjj_857db8bf-55a7-4dbe-a3e9-277f452b9fb9/cluster-monitoring-operator/0.log" Apr 17 17:21:46.551427 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:46.551394 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-gq92p_5820ffd9-b681-4007-8950-dab81ba98039/monitoring-plugin/0.log" Apr 17 17:21:46.586286 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:46.586261 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hh4b6_256d82eb-b394-49ee-aaa8-6beac670a01e/node-exporter/0.log" Apr 17 17:21:46.617975 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:46.617956 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hh4b6_256d82eb-b394-49ee-aaa8-6beac670a01e/kube-rbac-proxy/0.log" Apr 17 17:21:46.641423 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:46.641403 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hh4b6_256d82eb-b394-49ee-aaa8-6beac670a01e/init-textfile/0.log" Apr 17 17:21:47.237084 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:47.237051 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-54dd8cc7bb-rwpx2_44386569-af8e-4972-bf54-4134fe78214e/telemeter-client/0.log" Apr 17 17:21:47.262835 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:47.262805 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-54dd8cc7bb-rwpx2_44386569-af8e-4972-bf54-4134fe78214e/reload/0.log" Apr 17 17:21:47.284769 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:47.284727 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-54dd8cc7bb-rwpx2_44386569-af8e-4972-bf54-4134fe78214e/kube-rbac-proxy/0.log" Apr 17 17:21:49.348170 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.348138 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7cc6d8b846-gvnnk_298285ec-c3be-4bf5-8598-12b9a4f70d6d/console/0.log" Apr 17 17:21:49.400749 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.400700 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-tbr2k_21afef29-d7fa-4797-9a01-18075be87fb6/download-server/0.log" Apr 17 17:21:49.536520 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.536490 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr"] Apr 17 17:21:49.536833 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.536805 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerName="storage-initializer" Apr 17 17:21:49.536833 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.536822 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerName="storage-initializer" Apr 17 17:21:49.537021 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.536843 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerName="kube-rbac-proxy" Apr 17 17:21:49.537021 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.536852 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerName="kube-rbac-proxy" Apr 17 
17:21:49.537021 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.536865 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerName="kserve-container" Apr 17 17:21:49.537021 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.536873 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerName="kserve-container" Apr 17 17:21:49.537021 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.536962 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerName="kserve-container" Apr 17 17:21:49.537021 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.536977 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a67d317-a529-4322-97ba-9f8fb27dc2b7" containerName="kube-rbac-proxy" Apr 17 17:21:49.540040 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.540020 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" Apr 17 17:21:49.543642 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.543622 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tmwvb\"/\"kube-root-ca.crt\"" Apr 17 17:21:49.544611 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.544594 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tmwvb\"/\"openshift-service-ca.crt\"" Apr 17 17:21:49.544711 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.544626 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tmwvb\"/\"default-dockercfg-dkhbl\"" Apr 17 17:21:49.549199 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.549179 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr"] Apr 17 17:21:49.692571 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.692489 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0b3196f7-47e1-451e-a307-e232fba9e605-podres\") pod \"perf-node-gather-daemonset-qcbxr\" (UID: \"0b3196f7-47e1-451e-a307-e232fba9e605\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" Apr 17 17:21:49.692571 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.692537 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0b3196f7-47e1-451e-a307-e232fba9e605-proc\") pod \"perf-node-gather-daemonset-qcbxr\" (UID: \"0b3196f7-47e1-451e-a307-e232fba9e605\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" Apr 17 17:21:49.692778 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.692599 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0b3196f7-47e1-451e-a307-e232fba9e605-lib-modules\") pod \"perf-node-gather-daemonset-qcbxr\" (UID: \"0b3196f7-47e1-451e-a307-e232fba9e605\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" Apr 17 17:21:49.692778 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.692645 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt9k5\" (UniqueName: 
\"kubernetes.io/projected/0b3196f7-47e1-451e-a307-e232fba9e605-kube-api-access-kt9k5\") pod \"perf-node-gather-daemonset-qcbxr\" (UID: \"0b3196f7-47e1-451e-a307-e232fba9e605\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" Apr 17 17:21:49.692778 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.692673 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b3196f7-47e1-451e-a307-e232fba9e605-sys\") pod \"perf-node-gather-daemonset-qcbxr\" (UID: \"0b3196f7-47e1-451e-a307-e232fba9e605\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" Apr 17 17:21:49.793984 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.793942 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0b3196f7-47e1-451e-a307-e232fba9e605-podres\") pod \"perf-node-gather-daemonset-qcbxr\" (UID: \"0b3196f7-47e1-451e-a307-e232fba9e605\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" Apr 17 17:21:49.794182 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.793999 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0b3196f7-47e1-451e-a307-e232fba9e605-proc\") pod \"perf-node-gather-daemonset-qcbxr\" (UID: \"0b3196f7-47e1-451e-a307-e232fba9e605\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" Apr 17 17:21:49.794182 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.794046 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0b3196f7-47e1-451e-a307-e232fba9e605-lib-modules\") pod \"perf-node-gather-daemonset-qcbxr\" (UID: \"0b3196f7-47e1-451e-a307-e232fba9e605\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" Apr 17 17:21:49.794182 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.794073 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0b3196f7-47e1-451e-a307-e232fba9e605-proc\") pod \"perf-node-gather-daemonset-qcbxr\" (UID: \"0b3196f7-47e1-451e-a307-e232fba9e605\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" Apr 17 17:21:49.794182 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.794101 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0b3196f7-47e1-451e-a307-e232fba9e605-podres\") pod \"perf-node-gather-daemonset-qcbxr\" (UID: \"0b3196f7-47e1-451e-a307-e232fba9e605\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" Apr 17 17:21:49.794182 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.794103 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kt9k5\" (UniqueName: \"kubernetes.io/projected/0b3196f7-47e1-451e-a307-e232fba9e605-kube-api-access-kt9k5\") pod \"perf-node-gather-daemonset-qcbxr\" (UID: \"0b3196f7-47e1-451e-a307-e232fba9e605\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" Apr 17 17:21:49.794182 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.794157 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b3196f7-47e1-451e-a307-e232fba9e605-sys\") pod \"perf-node-gather-daemonset-qcbxr\" (UID: \"0b3196f7-47e1-451e-a307-e232fba9e605\") " 
pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" Apr 17 17:21:49.794182 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.794170 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0b3196f7-47e1-451e-a307-e232fba9e605-lib-modules\") pod \"perf-node-gather-daemonset-qcbxr\" (UID: \"0b3196f7-47e1-451e-a307-e232fba9e605\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" Apr 17 17:21:49.794458 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.794246 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b3196f7-47e1-451e-a307-e232fba9e605-sys\") pod \"perf-node-gather-daemonset-qcbxr\" (UID: \"0b3196f7-47e1-451e-a307-e232fba9e605\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" Apr 17 17:21:49.805304 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.805279 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt9k5\" (UniqueName: \"kubernetes.io/projected/0b3196f7-47e1-451e-a307-e232fba9e605-kube-api-access-kt9k5\") pod \"perf-node-gather-daemonset-qcbxr\" (UID: \"0b3196f7-47e1-451e-a307-e232fba9e605\") " pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" Apr 17 17:21:49.850580 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.850555 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" Apr 17 17:21:49.852682 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.852659 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-q49fp_7324ef3b-f554-4b99-8416-ea798d8b7d08/volume-data-source-validator/0.log" Apr 17 17:21:49.971716 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.971689 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr"] Apr 17 17:21:49.974354 ip-10-0-129-144 kubenswrapper[2574]: W0417 17:21:49.974322 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0b3196f7_47e1_451e_a307_e232fba9e605.slice/crio-2637864cd95600e174608f68693b67f1956161408c420c4af38419a055d3aa63 WatchSource:0}: Error finding container 2637864cd95600e174608f68693b67f1956161408c420c4af38419a055d3aa63: Status 404 returned error can't find the container with id 2637864cd95600e174608f68693b67f1956161408c420c4af38419a055d3aa63 Apr 17 17:21:49.975971 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.975955 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:21:49.999765 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:49.999741 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" event={"ID":"0b3196f7-47e1-451e-a307-e232fba9e605","Type":"ContainerStarted","Data":"2637864cd95600e174608f68693b67f1956161408c420c4af38419a055d3aa63"} Apr 17 17:21:50.648272 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:50.648240 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vpvzg_06db4982-9078-40f4-a267-e270f44de092/dns/0.log" Apr 17 17:21:50.673104 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:50.673077 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-vpvzg_06db4982-9078-40f4-a267-e270f44de092/kube-rbac-proxy/0.log" Apr 17 17:21:50.757998 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:50.757970 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-pbmrc_ec869d90-6e5f-4329-9d3c-62938cb140e5/dns-node-resolver/0.log" Apr 17 17:21:51.004761 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:51.004657 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" event={"ID":"0b3196f7-47e1-451e-a307-e232fba9e605","Type":"ContainerStarted","Data":"ff2203f14c5b22e283b0cf865770e4d173890a312f0b9024465bb6e4c08afe16"} Apr 17 17:21:51.004920 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:51.004770 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" Apr 17 17:21:51.020400 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:51.020356 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" podStartSLOduration=2.020343688 podStartE2EDuration="2.020343688s" podCreationTimestamp="2026-04-17 17:21:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:21:51.019754037 +0000 UTC m=+3032.243081822" watchObservedRunningTime="2026-04-17 17:21:51.020343688 +0000 UTC m=+3032.243671466" Apr 17 17:21:51.184495 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:51.184463 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5cfd58fd7f-ztnxd_ef63f7b9-fc9e-4558-85c1-fd232a8cc5c0/registry/0.log" Apr 17 17:21:51.258310 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:51.258239 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-6jdwh_e613ab88-5ce8-4dda-a30c-57006804fdb0/node-ca/0.log" Apr 17 17:21:52.320907 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:52.320884 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-jvmkj_34b1d336-8626-43f1-8ced-2764a72b207b/serve-healthcheck-canary/0.log" Apr 17 17:21:53.012201 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:53.012173 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-r6l2s_a65f5aab-b1e8-48b9-80dc-4d625056d509/kube-rbac-proxy/0.log" Apr 17 17:21:53.037045 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:53.037022 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-r6l2s_a65f5aab-b1e8-48b9-80dc-4d625056d509/exporter/0.log" Apr 17 17:21:53.068677 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:53.068656 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-r6l2s_a65f5aab-b1e8-48b9-80dc-4d625056d509/extractor/0.log" Apr 17 17:21:55.080593 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:55.080562 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-85bb65f8c4-plzn4_4c3d5f71-a5f2-47f2-9db4-410ca025746a/manager/0.log" Apr 17 17:21:55.653853 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:55.653821 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-4rf2d_0cb79293-2f28-472a-be51-fe2e1dd8c1fa/manager/0.log" 
Apr 17 17:21:55.718387 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:55.718344 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-54vqb_57f2077a-d61e-4eaf-82ca-083edfeeb652/seaweedfs/0.log" Apr 17 17:21:57.017449 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:21:57.017418 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tmwvb/perf-node-gather-daemonset-qcbxr" Apr 17 17:22:00.128497 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:00.128460 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-wkqhn_9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf/kube-storage-version-migrator-operator/1.log" Apr 17 17:22:00.130109 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:00.130080 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-wkqhn_9bb3a924-c4e5-4338-9f7c-b9efa83f3cdf/kube-storage-version-migrator-operator/0.log" Apr 17 17:22:01.070684 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:01.070659 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6m4wt_2a3eeab2-52f2-4ba5-a534-bef4430448f8/kube-multus-additional-cni-plugins/0.log" Apr 17 17:22:01.097321 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:01.097254 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6m4wt_2a3eeab2-52f2-4ba5-a534-bef4430448f8/egress-router-binary-copy/0.log" Apr 17 17:22:01.124609 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:01.124582 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6m4wt_2a3eeab2-52f2-4ba5-a534-bef4430448f8/cni-plugins/0.log" Apr 17 17:22:01.149453 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:01.149430 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6m4wt_2a3eeab2-52f2-4ba5-a534-bef4430448f8/bond-cni-plugin/0.log" Apr 17 17:22:01.174633 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:01.174609 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6m4wt_2a3eeab2-52f2-4ba5-a534-bef4430448f8/routeoverride-cni/0.log" Apr 17 17:22:01.202538 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:01.202513 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6m4wt_2a3eeab2-52f2-4ba5-a534-bef4430448f8/whereabouts-cni-bincopy/0.log" Apr 17 17:22:01.227225 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:01.227199 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6m4wt_2a3eeab2-52f2-4ba5-a534-bef4430448f8/whereabouts-cni/0.log" Apr 17 17:22:01.720055 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:01.720021 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t7lqr_1a680fa9-f376-482e-af97-a722bc5b37c6/kube-multus/0.log" Apr 17 17:22:01.741079 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:01.741053 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-42tgv_b57612e9-f335-4d71-bdba-f06f0735eee1/network-metrics-daemon/0.log" Apr 17 17:22:01.763833 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:01.763806 2574 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-42tgv_b57612e9-f335-4d71-bdba-f06f0735eee1/kube-rbac-proxy/0.log" Apr 17 17:22:03.034485 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:03.034456 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-controller/0.log" Apr 17 17:22:03.057393 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:03.057369 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/0.log" Apr 17 17:22:03.083430 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:03.083403 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovn-acl-logging/1.log" Apr 17 17:22:03.108512 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:03.108484 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/kube-rbac-proxy-node/0.log" Apr 17 17:22:03.132804 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:03.132779 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 17:22:03.155077 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:03.155053 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/northd/0.log" Apr 17 17:22:03.180865 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:03.180845 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/nbdb/0.log" Apr 17 17:22:03.212928 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:03.212896 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/sbdb/0.log" Apr 17 17:22:03.314182 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:03.314102 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bv26r_677ce4c5-2494-4409-bb8c-263a71ca26d1/ovnkube-controller/0.log" Apr 17 17:22:04.730748 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:04.730701 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-gwj7d_cad60853-6c2f-411f-935c-f5890843bbf1/network-check-target-container/0.log" Apr 17 17:22:05.711941 ip-10-0-129-144 kubenswrapper[2574]: I0417 17:22:05.711854 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-tm6mx_318876c9-9879-4dc1-a7cb-9664f3196aa9/iptables-alerter/0.log"