Apr 17 10:14:55.256768 ip-10-0-128-180 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 10:14:55.256780 ip-10-0-128-180 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 10:14:55.256790 ip-10-0-128-180 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 10:14:55.257155 ip-10-0-128-180 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 10:15:05.412239 ip-10-0-128-180 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 10:15:05.412257 ip-10-0-128-180 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 3eafeb8b80604f5aa74fb9863e3576c0 --
Apr 17 10:17:20.128741 ip-10-0-128-180 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 10:17:20.636002 ip-10-0-128-180 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 10:17:20.636002 ip-10-0-128-180 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 10:17:20.636002 ip-10-0-128-180 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 10:17:20.636002 ip-10-0-128-180 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 10:17:20.636002 ip-10-0-128-180 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 10:17:20.637776 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.637684 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 10:17:20.640254 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640232 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 10:17:20.640254 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640251 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 10:17:20.640254 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640258 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 10:17:20.640473 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640263 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 10:17:20.640473 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640268 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 10:17:20.640473 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640272 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 10:17:20.640473 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640276 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 10:17:20.640473 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640279 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 10:17:20.640473 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640282 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 10:17:20.640473 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640286 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 10:17:20.640473 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640289 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 10:17:20.640473 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640293 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 10:17:20.640473 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640297 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 10:17:20.640473 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640301 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 10:17:20.640473 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640304 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 10:17:20.640473 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640308 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 10:17:20.640473 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640312 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 10:17:20.640473 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640316 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 10:17:20.640473 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640319 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 10:17:20.640473 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640328 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 10:17:20.640473 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640332 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 10:17:20.640473 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640336 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 10:17:20.640473 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640340 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 10:17:20.641311 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640344 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 10:17:20.641311 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640348 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 10:17:20.641311 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640352 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 10:17:20.641311 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640356 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 10:17:20.641311 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640360 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 10:17:20.641311 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640363 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 10:17:20.641311 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640368 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 10:17:20.641311 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640372 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 10:17:20.641311 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640376 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 10:17:20.641311 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640380 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 10:17:20.641311 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640384 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 10:17:20.641311 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640388 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 10:17:20.641311 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640392 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 10:17:20.641311 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640399 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 10:17:20.641311 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640403 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 10:17:20.641311 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640407 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 10:17:20.641311 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640411 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 10:17:20.641311 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640436 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 10:17:20.641311 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640442 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 10:17:20.642153 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640447 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 10:17:20.642153 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640452 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 10:17:20.642153 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640457 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 10:17:20.642153 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640462 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 10:17:20.642153 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640467 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 10:17:20.642153 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640471 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 10:17:20.642153 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640475 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 10:17:20.642153 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640479 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 10:17:20.642153 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640484 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 10:17:20.642153 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640488 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 10:17:20.642153 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640492 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 10:17:20.642153 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640496 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 10:17:20.642153 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640501 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 10:17:20.642153 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640505 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 10:17:20.642153 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640509 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 10:17:20.642153 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640513 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 10:17:20.642153 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640518 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 10:17:20.642153 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640522 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 10:17:20.642153 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640526 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 10:17:20.642153 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640531 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 10:17:20.643037 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640535 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 10:17:20.643037 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640538 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 10:17:20.643037 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640542 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 10:17:20.643037 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640549 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 10:17:20.643037 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640554 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 10:17:20.643037 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640558 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 10:17:20.643037 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640563 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 10:17:20.643037 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640568 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 10:17:20.643037 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640572 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 10:17:20.643037 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640576 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 10:17:20.643037 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640580 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 10:17:20.643037 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640584 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 10:17:20.643037 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640589 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 10:17:20.643037 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640595 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 10:17:20.643037 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640599 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 10:17:20.643037 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640603 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 10:17:20.643037 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640608 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 10:17:20.643037 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640612 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 10:17:20.643037 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640616 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 10:17:20.643037 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640620 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 10:17:20.643717 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640624 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 10:17:20.643717 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640628 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 10:17:20.643717 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640632 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 10:17:20.643717 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.640636 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 10:17:20.643717 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641308 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 10:17:20.643717 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641319 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 10:17:20.643717 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641324 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 10:17:20.643717 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641328 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 10:17:20.643717 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641333 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 10:17:20.643717 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641337 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 10:17:20.643717 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641342 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 10:17:20.643717 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641346 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 10:17:20.643717 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641352 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 10:17:20.643717 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641357 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 10:17:20.643717 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641361 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 10:17:20.643717 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641366 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 10:17:20.643717 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641370 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 10:17:20.643717 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641374 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 10:17:20.643717 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641379 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 10:17:20.643717 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641383 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 10:17:20.644189 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641387 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 10:17:20.644189 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641392 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 10:17:20.644189 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641397 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 10:17:20.644189 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641401 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 10:17:20.644189 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641406 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 10:17:20.644189 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641429 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 10:17:20.644189 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641434 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 10:17:20.644189 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641438 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 10:17:20.644189 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641442 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 10:17:20.644189 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641447 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 10:17:20.644189 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641451 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 10:17:20.644189 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641455 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 10:17:20.644189 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641459 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 10:17:20.644189 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641463 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 10:17:20.644189 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641467 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 10:17:20.644189 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641471 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 10:17:20.644189 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641475 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 10:17:20.644189 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641480 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 10:17:20.644189 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641485 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 10:17:20.644189 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641489 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 10:17:20.644933 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641493 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 10:17:20.644933 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641497 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 10:17:20.644933 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641502 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 10:17:20.644933 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641506 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 10:17:20.644933 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641510 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 10:17:20.644933 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641514 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 10:17:20.644933 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641518 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 10:17:20.644933 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641522 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 10:17:20.644933 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641526 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 10:17:20.644933 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641530 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 10:17:20.644933 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641535 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 10:17:20.644933 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641539 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 10:17:20.644933 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641543 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 10:17:20.644933 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641550 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 10:17:20.644933 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641555 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 10:17:20.644933 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641560 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 10:17:20.644933 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641564 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 10:17:20.644933 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641568 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 10:17:20.644933 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641574 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 10:17:20.645445 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641578 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 10:17:20.645445 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641582 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 10:17:20.645445 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641586 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 10:17:20.645445 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641591 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 10:17:20.645445 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641595 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 10:17:20.645445 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641599 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 10:17:20.645445 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641605 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 10:17:20.645445 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641609 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 10:17:20.645445 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641612 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 10:17:20.645445 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641617 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 10:17:20.645445 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641621 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 10:17:20.645445 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641625 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 10:17:20.645445 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641629 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 10:17:20.645445 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641636 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 10:17:20.645445 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641642 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 10:17:20.645445 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641647 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 10:17:20.645445 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641652 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 10:17:20.645445 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641656 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 10:17:20.645445 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641661 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 10:17:20.645910 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641665 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 10:17:20.645910 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641669 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 10:17:20.645910 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641673 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 10:17:20.645910 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641678 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 10:17:20.645910 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641683 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 10:17:20.645910 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641687 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 10:17:20.645910 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641691 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 10:17:20.645910 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641695 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 10:17:20.645910 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641699 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 10:17:20.645910 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641704 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 10:17:20.645910 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641710 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 10:17:20.645910 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.641715 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 10:17:20.645910 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641822 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 10:17:20.645910 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641837 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 10:17:20.645910 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641851 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 10:17:20.645910 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641857 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 10:17:20.645910 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641864 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 10:17:20.645910 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641871 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 10:17:20.645910 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641878 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 10:17:20.645910 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641885 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 10:17:20.645910 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641891 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641896 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641901 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641907 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641912 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641917 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641928 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641933 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641938 2579 flags.go:64] FLAG: --cloud-config=""
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641943 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641948 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641958 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641967 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641972 2579 flags.go:64] FLAG: --config-dir=""
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641977 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641982 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641989 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641993 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.641999 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642005 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642009 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642014 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642019 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642024 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642028 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 10:17:20.646540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642035 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642051 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642056 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642061 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642066 2579 flags.go:64] FLAG: --enable-server="true"
Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642070 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642081 2579 flags.go:64] FLAG: --event-burst="100"
Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642086 2579 flags.go:64] FLAG: --event-qps="50"
Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642091 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642096 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642101 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642107 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642111 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642118 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642124 2579 flags.go:64] FLAG: --eviction-soft=""
Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642128 2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642133 2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642138 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642143 2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417
10:17:20.642149 2579 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642153 2579 flags.go:64] FLAG: --fail-swap-on="true" Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642158 2579 flags.go:64] FLAG: --feature-gates="" Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642164 2579 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642168 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642173 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 10:17:20.647163 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642179 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642184 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642189 2579 flags.go:64] FLAG: --help="false" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642194 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-128-180.ec2.internal" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642199 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642204 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642209 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642214 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 10:17:20.647789 
ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642220 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642233 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642239 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642244 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642249 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642254 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642259 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642263 2579 flags.go:64] FLAG: --kube-reserved="" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642268 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642273 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642278 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642283 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642290 2579 flags.go:64] FLAG: --lock-file="" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642294 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642299 2579 flags.go:64] FLAG: 
--log-flush-frequency="5s" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642304 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 10:17:20.647789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642319 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642324 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642331 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642335 2579 flags.go:64] FLAG: --logging-format="text" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642340 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642346 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642350 2579 flags.go:64] FLAG: --manifest-url="" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642355 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642362 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642367 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642374 2579 flags.go:64] FLAG: --max-pods="110" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642379 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642384 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 10:17:20.648355 ip-10-0-128-180 
kubenswrapper[2579]: I0417 10:17:20.642388 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642393 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642398 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642403 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642408 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642445 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642450 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642455 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642460 2579 flags.go:64] FLAG: --pod-cidr="" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642465 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 10:17:20.648355 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642474 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642479 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642484 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642489 2579 flags.go:64] FLAG: --port="10250" Apr 17 
10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642494 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642500 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a3be0e5914b02375" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642505 2579 flags.go:64] FLAG: --qos-reserved="" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642510 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642515 2579 flags.go:64] FLAG: --register-node="true" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642520 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642525 2579 flags.go:64] FLAG: --register-with-taints="" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642531 2579 flags.go:64] FLAG: --registry-burst="10" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642537 2579 flags.go:64] FLAG: --registry-qps="5" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642542 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642547 2579 flags.go:64] FLAG: --reserved-memory="" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642553 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642558 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642563 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642567 2579 flags.go:64] FLAG: 
--rotate-server-certificates="false" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642572 2579 flags.go:64] FLAG: --runonce="false" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642577 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642582 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642587 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642591 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642596 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642601 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 10:17:20.648920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642606 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642611 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642617 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642622 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642626 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642631 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642636 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 
10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642641 2579 flags.go:64] FLAG: --system-cgroups="" Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642646 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642655 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642660 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642666 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642676 2579 flags.go:64] FLAG: --tls-min-version="" Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642681 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642685 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642690 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642695 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642700 2579 flags.go:64] FLAG: --v="2" Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642709 2579 flags.go:64] FLAG: --version="false" Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642716 2579 flags.go:64] FLAG: --vmodule="" Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642723 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.642729 2579 flags.go:64] FLAG: 
--volume-stats-agg-period="1m0s" Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642886 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642894 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 10:17:20.649638 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642900 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 10:17:20.650215 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642905 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 10:17:20.650215 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642909 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 10:17:20.650215 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642914 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 10:17:20.650215 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642919 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 10:17:20.650215 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642924 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 10:17:20.650215 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642928 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 10:17:20.650215 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642932 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 10:17:20.650215 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642936 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 10:17:20.650215 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642940 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk 
Apr 17 10:17:20.650215 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642944 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 10:17:20.650215 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642949 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 10:17:20.650215 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642954 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 10:17:20.650215 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642958 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 10:17:20.650215 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642962 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 10:17:20.650215 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642966 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 10:17:20.650215 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642970 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 10:17:20.650215 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642974 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 10:17:20.650215 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642979 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 10:17:20.650215 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642985 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 10:17:20.650215 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642989 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 10:17:20.650782 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.642993 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 10:17:20.650782 ip-10-0-128-180 kubenswrapper[2579]: W0417 
10:17:20.642998 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 10:17:20.650782 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643003 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 10:17:20.650782 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643007 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 10:17:20.650782 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643011 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 17 10:17:20.650782 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643018 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 10:17:20.650782 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643022 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 10:17:20.650782 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643026 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 10:17:20.650782 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643030 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 10:17:20.650782 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643034 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 10:17:20.650782 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643038 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 10:17:20.650782 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643042 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 10:17:20.650782 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643045 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 10:17:20.650782 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643049 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 10:17:20.650782 
ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643053 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 10:17:20.650782 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643057 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 10:17:20.650782 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643061 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 10:17:20.650782 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643065 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 10:17:20.650782 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643069 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 10:17:20.650782 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643073 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 10:17:20.651277 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643077 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 10:17:20.651277 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643081 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 10:17:20.651277 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643085 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 10:17:20.651277 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643090 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 10:17:20.651277 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643094 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 10:17:20.651277 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643099 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 10:17:20.651277 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643103 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 
10:17:20.651277 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643107 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 10:17:20.651277 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643111 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 10:17:20.651277 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643115 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 10:17:20.651277 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643121 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 10:17:20.651277 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643126 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 10:17:20.651277 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643130 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 10:17:20.651277 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643134 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 10:17:20.651277 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643138 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 10:17:20.651277 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643142 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 10:17:20.651277 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643148 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 10:17:20.651277 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643155 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 10:17:20.651277 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643159 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 10:17:20.651277 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643163 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 10:17:20.651814 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643168 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 10:17:20.651814 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643172 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 10:17:20.651814 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643176 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 10:17:20.651814 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643180 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 10:17:20.651814 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643184 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 10:17:20.651814 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643188 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 10:17:20.651814 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643192 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 10:17:20.651814 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643196 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 10:17:20.651814 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643200 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 10:17:20.651814 ip-10-0-128-180 kubenswrapper[2579]: W0417 
10:17:20.643205 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 10:17:20.651814 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643209 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 10:17:20.651814 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643213 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 10:17:20.651814 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643217 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 10:17:20.651814 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643221 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 10:17:20.651814 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643225 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 10:17:20.651814 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643229 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 10:17:20.651814 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643234 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 10:17:20.651814 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643238 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 10:17:20.651814 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643243 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 10:17:20.651814 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643247 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 10:17:20.652310 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643252 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 10:17:20.652310 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643255 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 10:17:20.652310 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.643262 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 10:17:20.652310 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.643273 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 10:17:20.652310 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.651384 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 10:17:20.652310 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.651400 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 10:17:20.652310 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651462 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 10:17:20.652310 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651468 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 10:17:20.652310 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651472 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 10:17:20.652310 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651475 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 10:17:20.652310 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651478 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 10:17:20.652310 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651481 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 10:17:20.652310 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651486 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 10:17:20.652310 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651490 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 10:17:20.652310 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651493 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 10:17:20.652310 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651496 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 10:17:20.652733 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651499 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 10:17:20.652733 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651502 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 10:17:20.652733 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651505 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 10:17:20.652733 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651508 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 10:17:20.652733 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651511 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 10:17:20.652733 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651513 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 10:17:20.652733 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651516 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 10:17:20.652733 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651518 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 10:17:20.652733 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651521 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 10:17:20.652733 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651524 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 10:17:20.652733 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651527 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 10:17:20.652733 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651530 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 10:17:20.652733 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651533 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 10:17:20.652733 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651536 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 10:17:20.652733 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651539 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 10:17:20.652733 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651541 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 10:17:20.652733 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651544 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 10:17:20.652733 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651547 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 10:17:20.652733 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651549 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 10:17:20.652733 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651552 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 10:17:20.653217 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651555 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 10:17:20.653217 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651558 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 10:17:20.653217 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651561 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 10:17:20.653217 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651563 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 10:17:20.653217 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651566 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 10:17:20.653217 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651569 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 10:17:20.653217 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651571 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 10:17:20.653217 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651574 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 10:17:20.653217 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651578 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 10:17:20.653217 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651581 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 10:17:20.653217 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651584 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 10:17:20.653217 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651587 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 10:17:20.653217 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651590 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 10:17:20.653217 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651593 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 10:17:20.653217 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651595 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 10:17:20.653217 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651598 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 10:17:20.653217 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651601 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 10:17:20.653217 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651603 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 10:17:20.653217 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651606 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 10:17:20.653775 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651608 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 10:17:20.653775 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651611 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 10:17:20.653775 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651613 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 10:17:20.653775 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651616 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 10:17:20.653775 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651618 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 10:17:20.653775 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651621 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 10:17:20.653775 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651624 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 10:17:20.653775 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651627 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 10:17:20.653775 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651629 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 10:17:20.653775 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651632 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 10:17:20.653775 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651634 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 10:17:20.653775 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651637 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 10:17:20.653775 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651640 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 10:17:20.653775 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651642 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 10:17:20.653775 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651645 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 10:17:20.653775 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651648 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 10:17:20.653775 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651651 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 10:17:20.653775 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651654 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 10:17:20.653775 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651656 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 10:17:20.654267 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651659 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 10:17:20.654267 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651662 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 10:17:20.654267 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651665 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 10:17:20.654267 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651667 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 10:17:20.654267 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651670 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 10:17:20.654267 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651673 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 10:17:20.654267 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651675 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 10:17:20.654267 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651677 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 10:17:20.654267 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651680 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 10:17:20.654267 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651682 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 10:17:20.654267 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651685 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 10:17:20.654267 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651687 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 10:17:20.654267 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651690 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 10:17:20.654267 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651692 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 10:17:20.654267 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651695 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 10:17:20.654267 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651698 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 10:17:20.654267 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651700 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 10:17:20.654267 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651703 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 10:17:20.654724 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.651708 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 10:17:20.654724 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651809 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 10:17:20.654724 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651815 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 10:17:20.654724 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651818 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 10:17:20.654724 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651821 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 10:17:20.654724 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651824 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 10:17:20.654724 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651826 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 10:17:20.654724 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651830 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 10:17:20.654724 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651832 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 10:17:20.654724 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651835 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 10:17:20.654724 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651838 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 10:17:20.654724 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651841 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 10:17:20.654724 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651843 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 10:17:20.654724 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651847 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 10:17:20.654724 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651849 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 10:17:20.654724 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651852 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 10:17:20.655124 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651854 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 10:17:20.655124 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651857 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 10:17:20.655124 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651859 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 10:17:20.655124 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651862 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 10:17:20.655124 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651865 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 10:17:20.655124 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651867 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 10:17:20.655124 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651870 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 10:17:20.655124 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651872 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 10:17:20.655124 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651875 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 10:17:20.655124 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651877 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 10:17:20.655124 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651880 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 10:17:20.655124 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651882 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 10:17:20.655124 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651885 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 10:17:20.655124 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651888 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 10:17:20.655124 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651890 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 10:17:20.655124 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651893 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 10:17:20.655124 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651896 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 10:17:20.655124 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651898 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 10:17:20.655124 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651901 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 10:17:20.655124 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651904 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 10:17:20.655630 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651907 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 10:17:20.655630 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651909 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 10:17:20.655630 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651912 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 10:17:20.655630 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651915 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 10:17:20.655630 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651917 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 10:17:20.655630 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651921 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 10:17:20.655630 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651924 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 10:17:20.655630 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651926 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 10:17:20.655630 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651929 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 10:17:20.655630 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651931 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 10:17:20.655630 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651934 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 10:17:20.655630 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651937 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 10:17:20.655630 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651939 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 10:17:20.655630 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651942 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 10:17:20.655630 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651944 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 10:17:20.655630 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651947 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 10:17:20.655630 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651949 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 10:17:20.655630 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651952 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 10:17:20.655630 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651955 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 10:17:20.655630 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651958 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 10:17:20.656112 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651960 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 10:17:20.656112 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651963 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 10:17:20.656112 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651965 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 10:17:20.656112 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651968 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 10:17:20.656112 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651970 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 10:17:20.656112 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651973 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 10:17:20.656112 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651976 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 10:17:20.656112 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651978 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 10:17:20.656112 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651981 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 10:17:20.656112 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651984 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 10:17:20.656112 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651987 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 10:17:20.656112 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651991 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 10:17:20.656112 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651995 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 10:17:20.656112 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.651997 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 10:17:20.656112 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.652000 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 10:17:20.656112 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.652002 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 10:17:20.656112 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.652005 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 10:17:20.656112 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.652007 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 10:17:20.656112 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.652010 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 10:17:20.656112 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.652012 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 10:17:20.656616 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.652015 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 10:17:20.656616 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.652017 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 10:17:20.656616 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.652020 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 10:17:20.656616 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.652022 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 10:17:20.656616 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.652025 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 10:17:20.656616 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.652027 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 10:17:20.656616 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.652030 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 10:17:20.656616 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.652034 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 10:17:20.656616 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.652037 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 10:17:20.656616 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.652040 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 10:17:20.656616 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:20.652043 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 10:17:20.656616 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.652048 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 10:17:20.656616 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.652896 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 10:17:20.656616 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.655281 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 10:17:20.656616 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.656381 2579 server.go:1019] "Starting client certificate rotation"
Apr 17 10:17:20.656987 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.656480 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 10:17:20.657370 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.657358 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 10:17:20.683649 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.683622 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 10:17:20.691659 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.691635 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 10:17:20.707454 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.707431 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 17 10:17:20.713219 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.713199 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 10:17:20.714495 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.714482 2579 log.go:25] "Validated CRI v1 image API"
Apr 17 10:17:20.715966 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.715949 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 10:17:20.721978 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.721813 2579 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b5d10833-a6f1-4a00-b72a-b5757be1f825:/dev/nvme0n1p4 fc21848a-7741-41bf-a206-27e3d570a689:/dev/nvme0n1p3]
Apr 17 10:17:20.722028 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.721979 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 10:17:20.727367 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.727253 2579 manager.go:217] Machine: {Timestamp:2026-04-17 10:17:20.726002606 +0000 UTC m=+0.462188949 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3089976 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec250ce0923cf8966e4c7016145b433f SystemUUID:ec250ce0-923c-f896-6e4c-7016145b433f BootID:3eafeb8b-8060-4f5a-a74f-b9863e3576c0 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:60:b2:7b:13:3f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:60:b2:7b:13:3f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:a2:10:df:31:f2:2b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 10:17:20.727367 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.727362 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 10:17:20.727538 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.727463 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 10:17:20.728718 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.728690 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 10:17:20.728876 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.728720 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-180.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 10:17:20.728921 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.728886 2579 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 10:17:20.728921 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.728895 2579 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 10:17:20.728921 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.728908
2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 10:17:20.729002 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.728923 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 10:17:20.730293 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.730283 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 17 10:17:20.730404 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.730394 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 10:17:20.732933 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.732922 2579 kubelet.go:491] "Attempting to sync node with API server" Apr 17 10:17:20.732973 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.732936 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 10:17:20.733698 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.733688 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 10:17:20.733735 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.733703 2579 kubelet.go:397] "Adding apiserver pod source" Apr 17 10:17:20.733735 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.733713 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 17 10:17:20.734900 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.734888 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 10:17:20.734948 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.734926 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 10:17:20.739813 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.739776 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 10:17:20.742124 ip-10-0-128-180 
kubenswrapper[2579]: I0417 10:17:20.742106 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 10:17:20.744282 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.744259 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 10:17:20.744390 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.744290 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 10:17:20.744390 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.744305 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 10:17:20.744390 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.744315 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 10:17:20.744390 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.744329 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 10:17:20.744390 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.744338 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 10:17:20.744390 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.744345 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 10:17:20.744390 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.744354 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 10:17:20.744390 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.744365 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 10:17:20.744390 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.744374 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 10:17:20.744390 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.744386 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 
10:17:20.744703 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.744399 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 10:17:20.746277 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.746260 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 10:17:20.746277 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.746278 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 10:17:20.746747 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:20.746711 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 10:17:20.746818 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:20.746711 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-180.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 10:17:20.750319 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.750304 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 10:17:20.750405 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.750350 2579 server.go:1295] "Started kubelet" Apr 17 10:17:20.750481 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.750447 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 10:17:20.751098 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.751048 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 10:17:20.751161 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.751122 2579 server_v1.go:47] "podresources" method="list" 
useActivePods=true Apr 17 10:17:20.751122 ip-10-0-128-180 systemd[1]: Started Kubernetes Kubelet. Apr 17 10:17:20.753150 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.753132 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 10:17:20.753995 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.753977 2579 server.go:317] "Adding debug handlers to kubelet server" Apr 17 10:17:20.756529 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.756508 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vr5xd" Apr 17 10:17:20.757917 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.757893 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-180.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 10:17:20.759100 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:20.757954 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-180.ec2.internal.18a71d865173ce72 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-180.ec2.internal,UID:ip-10-0-128-180.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-180.ec2.internal,},FirstTimestamp:2026-04-17 10:17:20.75031717 +0000 UTC m=+0.486503514,LastTimestamp:2026-04-17 10:17:20.75031717 +0000 UTC m=+0.486503514,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-180.ec2.internal,}" Apr 17 10:17:20.759713 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.759697 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 10:17:20.759803 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.759740 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 10:17:20.760220 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:20.760200 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 10:17:20.760537 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.760517 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 10:17:20.760537 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.760518 2579 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 10:17:20.760671 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.760565 2579 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 10:17:20.760671 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.760598 2579 factory.go:55] Registering systemd factory Apr 17 10:17:20.760671 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.760618 2579 factory.go:223] Registration of the systemd container factory successfully Apr 17 10:17:20.760671 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.760637 2579 reconstruct.go:97] "Volume reconstruction finished" Apr 17 10:17:20.760671 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.760647 2579 reconciler.go:26] "Reconciler: start to sync state" Apr 17 10:17:20.760874 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:20.760716 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-180.ec2.internal\" not found" Apr 17 10:17:20.760874 ip-10-0-128-180 
kubenswrapper[2579]: I0417 10:17:20.760813 2579 factory.go:153] Registering CRI-O factory Apr 17 10:17:20.760874 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.760825 2579 factory.go:223] Registration of the crio container factory successfully Apr 17 10:17:20.760999 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.760875 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 10:17:20.760999 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.760904 2579 factory.go:103] Registering Raw factory Apr 17 10:17:20.760999 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.760921 2579 manager.go:1196] Started watching for new ooms in manager Apr 17 10:17:20.761266 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.761254 2579 manager.go:319] Starting recovery of all containers Apr 17 10:17:20.763728 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:20.763698 2579 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-180.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 10:17:20.763819 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:20.763789 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 10:17:20.764816 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.764796 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vr5xd" Apr 17 
10:17:20.771250 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.771187 2579 manager.go:324] Recovery completed Apr 17 10:17:20.776221 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.776202 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 10:17:20.778640 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.778622 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-180.ec2.internal" event="NodeHasSufficientMemory" Apr 17 10:17:20.778706 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.778660 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-180.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 10:17:20.778706 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.778678 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-180.ec2.internal" event="NodeHasSufficientPID" Apr 17 10:17:20.779270 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.779253 2579 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 10:17:20.779270 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.779267 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 10:17:20.779364 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.779283 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 17 10:17:20.780504 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:20.780403 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-180.ec2.internal.18a71d86532402d9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-180.ec2.internal,UID:ip-10-0-128-180.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node 
ip-10-0-128-180.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-128-180.ec2.internal,},FirstTimestamp:2026-04-17 10:17:20.778642137 +0000 UTC m=+0.514828483,LastTimestamp:2026-04-17 10:17:20.778642137 +0000 UTC m=+0.514828483,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-180.ec2.internal,}" Apr 17 10:17:20.783478 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.783464 2579 policy_none.go:49] "None policy: Start" Apr 17 10:17:20.783536 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.783482 2579 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 10:17:20.783536 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.783492 2579 state_mem.go:35] "Initializing new in-memory state store" Apr 17 10:17:20.821080 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.821061 2579 manager.go:341] "Starting Device Plugin manager" Apr 17 10:17:20.846137 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:20.821188 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 10:17:20.846137 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.821204 2579 server.go:85] "Starting device plugin registration server" Apr 17 10:17:20.846137 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.821464 2579 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 10:17:20.846137 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.821479 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 10:17:20.846137 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.821563 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 10:17:20.846137 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.821644 2579 plugin_manager.go:116] "The 
desired_state_of_world populator (plugin watcher) starts" Apr 17 10:17:20.846137 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.821651 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 10:17:20.846137 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:20.822449 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 10:17:20.846137 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:20.822485 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-180.ec2.internal\" not found" Apr 17 10:17:20.862160 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.862124 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 10:17:20.863380 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.863356 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 10:17:20.863380 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.863384 2579 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 10:17:20.863548 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.863409 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 10:17:20.863548 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.863432 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 10:17:20.863548 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:20.863526 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 10:17:20.865643 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.865626 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 10:17:20.922637 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.922545 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 10:17:20.923500 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.923483 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-180.ec2.internal" event="NodeHasSufficientMemory" Apr 17 10:17:20.923604 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.923515 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-180.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 10:17:20.923604 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.923525 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-180.ec2.internal" event="NodeHasSufficientPID" Apr 17 10:17:20.923604 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.923549 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-180.ec2.internal" Apr 17 10:17:20.934424 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.934387 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-180.ec2.internal" Apr 17 10:17:20.934547 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:20.934436 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-180.ec2.internal\": node \"ip-10-0-128-180.ec2.internal\" not found" Apr 17 
10:17:20.953512 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:20.953482 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-180.ec2.internal\" not found" Apr 17 10:17:20.963810 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.963783 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-128-180.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal"] Apr 17 10:17:20.963872 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.963864 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 10:17:20.966230 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.966210 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-180.ec2.internal" event="NodeHasSufficientMemory" Apr 17 10:17:20.966320 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.966244 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-180.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 10:17:20.966320 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.966254 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-180.ec2.internal" event="NodeHasSufficientPID" Apr 17 10:17:20.968589 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.968576 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 10:17:20.968726 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.968713 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-180.ec2.internal" Apr 17 10:17:20.968763 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.968744 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 10:17:20.969326 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.969306 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-180.ec2.internal" event="NodeHasSufficientMemory" Apr 17 10:17:20.969409 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.969313 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-180.ec2.internal" event="NodeHasSufficientMemory" Apr 17 10:17:20.969409 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.969370 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-180.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 10:17:20.969409 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.969383 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-180.ec2.internal" event="NodeHasSufficientPID" Apr 17 10:17:20.969409 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.969335 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-180.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 10:17:20.969554 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.969411 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-180.ec2.internal" event="NodeHasSufficientPID" Apr 17 10:17:20.971629 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.971610 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal" Apr 17 10:17:20.971716 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.971659 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 10:17:20.972566 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.972550 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-180.ec2.internal" event="NodeHasSufficientMemory" Apr 17 10:17:20.972656 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.972581 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-180.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 10:17:20.972656 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:20.972594 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-180.ec2.internal" event="NodeHasSufficientPID" Apr 17 10:17:21.000296 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:21.000265 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-180.ec2.internal\" not found" node="ip-10-0-128-180.ec2.internal" Apr 17 10:17:21.004885 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:21.004863 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-180.ec2.internal\" not found" node="ip-10-0-128-180.ec2.internal" Apr 17 10:17:21.054398 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:21.054363 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-180.ec2.internal\" not found" Apr 17 10:17:21.061728 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.061705 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/e9b8fe15074523188d471c5f98f7d062-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal\" (UID: \"e9b8fe15074523188d471c5f98f7d062\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal" Apr 17 10:17:21.061825 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.061741 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/63dc35addb937cfe54aeec10c17385f5-config\") pod \"kube-apiserver-proxy-ip-10-0-128-180.ec2.internal\" (UID: \"63dc35addb937cfe54aeec10c17385f5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-180.ec2.internal" Apr 17 10:17:21.061825 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.061767 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9b8fe15074523188d471c5f98f7d062-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal\" (UID: \"e9b8fe15074523188d471c5f98f7d062\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal" Apr 17 10:17:21.154865 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:21.154834 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-180.ec2.internal\" not found" Apr 17 10:17:21.162199 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.162179 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/63dc35addb937cfe54aeec10c17385f5-config\") pod \"kube-apiserver-proxy-ip-10-0-128-180.ec2.internal\" (UID: \"63dc35addb937cfe54aeec10c17385f5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-180.ec2.internal" Apr 17 10:17:21.162253 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.162208 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9b8fe15074523188d471c5f98f7d062-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal\" (UID: \"e9b8fe15074523188d471c5f98f7d062\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal"
Apr 17 10:17:21.162294 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.162251 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9b8fe15074523188d471c5f98f7d062-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal\" (UID: \"e9b8fe15074523188d471c5f98f7d062\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal"
Apr 17 10:17:21.162294 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.162271 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/63dc35addb937cfe54aeec10c17385f5-config\") pod \"kube-apiserver-proxy-ip-10-0-128-180.ec2.internal\" (UID: \"63dc35addb937cfe54aeec10c17385f5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-180.ec2.internal"
Apr 17 10:17:21.162358 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.162301 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9b8fe15074523188d471c5f98f7d062-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal\" (UID: \"e9b8fe15074523188d471c5f98f7d062\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal"
Apr 17 10:17:21.162358 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.162308 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9b8fe15074523188d471c5f98f7d062-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal\" (UID: \"e9b8fe15074523188d471c5f98f7d062\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal"
Apr 17 10:17:21.255675 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:21.255594 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-180.ec2.internal\" not found"
Apr 17 10:17:21.302112 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.302070 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-180.ec2.internal"
Apr 17 10:17:21.307591 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.307568 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal"
Apr 17 10:17:21.356750 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:21.356708 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-180.ec2.internal\" not found"
Apr 17 10:17:21.457175 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:21.457140 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-180.ec2.internal\" not found"
Apr 17 10:17:21.557736 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:21.557701 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-180.ec2.internal\" not found"
Apr 17 10:17:21.656349 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.656315 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 10:17:21.657002 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.656501 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 10:17:21.658473 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:21.658449 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-180.ec2.internal\" not found"
Apr 17 10:17:21.681636 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.681603 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 10:17:21.758867 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:21.758814 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-180.ec2.internal\" not found"
Apr 17 10:17:21.759940 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.759923 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 10:17:21.768803 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.768779 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 10:17:21.769586 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.769554 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 10:12:20 +0000 UTC" deadline="2027-11-16 20:56:17.120721613 +0000 UTC"
Apr 17 10:17:21.769651 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.769586 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13882h38m55.351138646s"
Apr 17 10:17:21.790858 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.790827 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-hjlfj"
Apr 17 10:17:21.796515 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.796492 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-hjlfj"
Apr 17 10:17:21.843674 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.843613 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 10:17:21.859619 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:21.859592 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-180.ec2.internal\" not found"
Apr 17 10:17:21.960540 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:21.960504 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-180.ec2.internal\" not found"
Apr 17 10:17:21.979568 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:21.979343 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63dc35addb937cfe54aeec10c17385f5.slice/crio-46159f6e7b98dc2475ede7480461cf12f028d78ce587b90b7dc77711a88b9bfb WatchSource:0}: Error finding container 46159f6e7b98dc2475ede7480461cf12f028d78ce587b90b7dc77711a88b9bfb: Status 404 returned error can't find the container with id 46159f6e7b98dc2475ede7480461cf12f028d78ce587b90b7dc77711a88b9bfb
Apr 17 10:17:21.980800 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:21.980297 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9b8fe15074523188d471c5f98f7d062.slice/crio-d8d632f697f8d08a677ae0d09a54fe1ede7edb7443b0d4a55bfd0941791abb12 WatchSource:0}: Error finding container d8d632f697f8d08a677ae0d09a54fe1ede7edb7443b0d4a55bfd0941791abb12: Status 404 returned error can't find the container with id d8d632f697f8d08a677ae0d09a54fe1ede7edb7443b0d4a55bfd0941791abb12
Apr 17 10:17:21.985773 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:21.985755 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 10:17:22.060611 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:22.060569 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-180.ec2.internal\" not found"
Apr 17 10:17:22.088988 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.088960 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 10:17:22.160915 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.160824 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-180.ec2.internal"
Apr 17 10:17:22.171504 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.171477 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 10:17:22.172521 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.172508 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal"
Apr 17 10:17:22.184096 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.184071 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 10:17:22.735243 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.735193 2579 apiserver.go:52] "Watching apiserver"
Apr 17 10:17:22.741396 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.741360 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 10:17:22.741855 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.741826 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-hdfrm","openshift-image-registry/node-ca-qlxxc","openshift-multus/multus-v8cbj","kube-system/konnectivity-agent-8pm8f","kube-system/kube-apiserver-proxy-ip-10-0-128-180.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal","openshift-multus/multus-additional-cni-plugins-jwf2j","openshift-multus/network-metrics-daemon-6q4fn","openshift-network-diagnostics/network-check-target-xzjpm","openshift-network-operator/iptables-alerter-pg58d","openshift-ovn-kubernetes/ovnkube-node-ggsw6","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq"]
Apr 17 10:17:22.747621 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.747523 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8pm8f"
Apr 17 10:17:22.747621 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.747617 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6q4fn"
Apr 17 10:17:22.747831 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:22.747699 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6q4fn" podUID="964ce349-102e-4852-b1d8-a4c58565a4a0"
Apr 17 10:17:22.749900 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.749873 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.751042 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.750882 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gjf8s\""
Apr 17 10:17:22.751042 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.750881 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 10:17:22.751221 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.751160 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 10:17:22.752078 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.752056 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 10:17:22.752375 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.752356 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 10:17:22.752666 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.752648 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-wrk6p\""
Apr 17 10:17:22.754245 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.753857 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qlxxc"
Apr 17 10:17:22.756007 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.755970 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 10:17:22.756104 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.756069 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 10:17:22.756373 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.756222 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-tbk5h\""
Apr 17 10:17:22.756373 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.756335 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 10:17:22.756565 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.756372 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.758447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.758428 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 10:17:22.758541 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.758453 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 10:17:22.758602 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.758454 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-kzmwn\""
Apr 17 10:17:22.758602 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.758576 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 10:17:22.758721 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.758685 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 10:17:22.761871 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.761645 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jwf2j"
Apr 17 10:17:22.761871 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.761853 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzjpm"
Apr 17 10:17:22.762005 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:22.761912 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzjpm" podUID="775480d3-f4fc-4837-9404-54348aa3f205"
Apr 17 10:17:22.763942 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.763917 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-pg58d"
Apr 17 10:17:22.764096 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.764077 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-c4lzl\""
Apr 17 10:17:22.764491 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.764472 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 10:17:22.764578 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.764538 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 10:17:22.766936 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.766504 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.766936 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.766561 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 10:17:22.766936 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.766611 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 10:17:22.766936 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.766735 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 10:17:22.767199 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.767156 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dvbjs\""
Apr 17 10:17:22.768611 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.768563 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 10:17:22.768611 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.768585 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 10:17:22.769124 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.768879 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq"
Apr 17 10:17:22.770406 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.770378 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j"
Apr 17 10:17:22.770514 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.770445 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-run\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.770608 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.770488 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-host-var-lib-kubelet\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.770779 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.770760 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j"
Apr 17 10:17:22.770869 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.770809 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6pgj\" (UniqueName: \"kubernetes.io/projected/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-kube-api-access-x6pgj\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j"
Apr 17 10:17:22.770869 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.770851 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp2bx\" (UniqueName: \"kubernetes.io/projected/775480d3-f4fc-4837-9404-54348aa3f205-kube-api-access-wp2bx\") pod \"network-check-target-xzjpm\" (UID: \"775480d3-f4fc-4837-9404-54348aa3f205\") " pod="openshift-network-diagnostics/network-check-target-xzjpm"
Apr 17 10:17:22.770967 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.770890 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-etc-sysconfig\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.771065 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.771043 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-sys\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.771151 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.771095 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c8515f25-7800-40c5-a053-06805e22f902-cni-binary-copy\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.771151 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.771125 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nsrf\" (UniqueName: \"kubernetes.io/projected/c8515f25-7800-40c5-a053-06805e22f902-kube-api-access-6nsrf\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.771233 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.771158 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-etc-systemd\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.771308 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.771290 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-glgn2\""
Apr 17 10:17:22.771344 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.771191 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-multus-cni-dir\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.771376 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.771342 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-os-release\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.771456 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.771400 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-host-var-lib-cni-bin\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.771456 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.771442 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-cnibin\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j"
Apr 17 10:17:22.771546 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.771483 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9ff2c5cc-5a1b-4207-b5da-e7cd37db94b0-agent-certs\") pod \"konnectivity-agent-8pm8f\" (UID: \"9ff2c5cc-5a1b-4207-b5da-e7cd37db94b0\") " pod="kube-system/konnectivity-agent-8pm8f"
Apr 17 10:17:22.772627 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.772112 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 10:17:22.772627 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.772148 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 10:17:22.772627 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.772245 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-tv4kt\""
Apr 17 10:17:22.772627 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.772378 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 10:17:22.772627 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.772408 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 10:17:22.774807 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.772941 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 10:17:22.774807 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.772952 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 10:17:22.774807 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.773022 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 10:17:22.774807 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.771519 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs\") pod \"network-metrics-daemon-6q4fn\" (UID: \"964ce349-102e-4852-b1d8-a4c58565a4a0\") " pod="openshift-multus/network-metrics-daemon-6q4fn"
Apr 17 10:17:22.774807 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.774128 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-etc-sysctl-conf\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.774807 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.774160 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-host\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.774807 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.774184 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/49bcd63d-9323-4368-9c4c-5edfc7b371f7-etc-tuned\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.774807 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.774241 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h72sd\" (UniqueName: \"kubernetes.io/projected/e8471c84-04de-44cd-a299-b4dc3d7003ab-kube-api-access-h72sd\") pod \"node-ca-qlxxc\" (UID: \"e8471c84-04de-44cd-a299-b4dc3d7003ab\") " pod="openshift-image-registry/node-ca-qlxxc"
Apr 17 10:17:22.774807 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.774268 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-hostroot\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.774807 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.774466 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-multus-conf-dir\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.774807 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.774507 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-etc-modprobe-d\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.774807 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.774536 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-lib-modules\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.774807 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.774570 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf6sj\" (UniqueName: \"kubernetes.io/projected/49bcd63d-9323-4368-9c4c-5edfc7b371f7-kube-api-access-bf6sj\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.774807 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.774608 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e8471c84-04de-44cd-a299-b4dc3d7003ab-serviceca\") pod \"node-ca-qlxxc\" (UID: \"e8471c84-04de-44cd-a299-b4dc3d7003ab\") " pod="openshift-image-registry/node-ca-qlxxc"
Apr 17 10:17:22.774807 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.774639 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-cnibin\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.774807 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.774681 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c8515f25-7800-40c5-a053-06805e22f902-multus-daemon-config\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.774807 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.774730 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-host-run-multus-certs\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.774807 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.774796 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-etc-kubernetes\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.776130 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.774862 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e8471c84-04de-44cd-a299-b4dc3d7003ab-host\") pod \"node-ca-qlxxc\" (UID: \"e8471c84-04de-44cd-a299-b4dc3d7003ab\") " pod="openshift-image-registry/node-ca-qlxxc"
Apr 17 10:17:22.776130 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.774898 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-system-cni-dir\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.776130 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.774931 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-multus-socket-dir-parent\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.776130 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.774966 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j"
Apr 17 10:17:22.776130 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.775001 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-etc-kubernetes\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.776130 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.775035 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-etc-sysctl-d\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.776130 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.775060 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-host-run-k8s-cni-cncf-io\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.776130 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.775091 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-host-run-netns\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.776130 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.775121 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-cni-binary-copy\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j"
Apr 17 10:17:22.776130 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.775158 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9ff2c5cc-5a1b-4207-b5da-e7cd37db94b0-konnectivity-ca\") pod \"konnectivity-agent-8pm8f\" (UID: \"9ff2c5cc-5a1b-4207-b5da-e7cd37db94b0\") " pod="kube-system/konnectivity-agent-8pm8f"
Apr 17 10:17:22.776130 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.775190 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clf4d\" (UniqueName: \"kubernetes.io/projected/964ce349-102e-4852-b1d8-a4c58565a4a0-kube-api-access-clf4d\") pod \"network-metrics-daemon-6q4fn\" (UID: \"964ce349-102e-4852-b1d8-a4c58565a4a0\") " pod="openshift-multus/network-metrics-daemon-6q4fn"
Apr 17 10:17:22.776130 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.775207 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-var-lib-kubelet\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.776130 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.775237 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/49bcd63d-9323-4368-9c4c-5edfc7b371f7-tmp\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.776130 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.775259 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-host-var-lib-cni-multus\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.776130 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.775277 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-system-cni-dir\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j"
Apr 17 10:17:22.776130 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.775296 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-os-release\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j"
Apr 17 10:17:22.798959
ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.798918 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 10:12:21 +0000 UTC" deadline="2027-10-14 19:19:31.109609543 +0000 UTC" Apr 17 10:17:22.798959 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.798952 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13089h2m8.310661717s" Apr 17 10:17:22.862157 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.862125 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 10:17:22.869520 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.869451 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal" event={"ID":"e9b8fe15074523188d471c5f98f7d062","Type":"ContainerStarted","Data":"d8d632f697f8d08a677ae0d09a54fe1ede7edb7443b0d4a55bfd0941791abb12"} Apr 17 10:17:22.870499 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.870465 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-180.ec2.internal" event={"ID":"63dc35addb937cfe54aeec10c17385f5","Type":"ContainerStarted","Data":"46159f6e7b98dc2475ede7480461cf12f028d78ce587b90b7dc77711a88b9bfb"} Apr 17 10:17:22.875728 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.875699 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-var-lib-kubelet\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm" Apr 17 10:17:22.875728 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.875731 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/49bcd63d-9323-4368-9c4c-5edfc7b371f7-tmp\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm" Apr 17 10:17:22.875933 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.875757 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-system-cni-dir\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j" Apr 17 10:17:22.875933 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.875781 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j" Apr 17 10:17:22.875933 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.875825 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6pgj\" (UniqueName: \"kubernetes.io/projected/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-kube-api-access-x6pgj\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j" Apr 17 10:17:22.875933 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.875831 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-var-lib-kubelet\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm" Apr 17 10:17:22.875933 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.875829 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-system-cni-dir\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j" Apr 17 10:17:22.875933 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.875855 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-var-lib-openvswitch\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.875933 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.875893 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c7kh\" (UniqueName: \"kubernetes.io/projected/7ec60248-7d9c-4930-984a-1104f6a4e7a0-kube-api-access-6c7kh\") pod \"aws-ebs-csi-driver-node-9v4bq\" (UID: \"7ec60248-7d9c-4930-984a-1104f6a4e7a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq" Apr 17 10:17:22.876259 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876014 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-run\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm" Apr 17 10:17:22.876259 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876043 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-host-var-lib-kubelet\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " 
pod="openshift-multus/multus-v8cbj" Apr 17 10:17:22.876259 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876069 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j" Apr 17 10:17:22.876259 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876097 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wp2bx\" (UniqueName: \"kubernetes.io/projected/775480d3-f4fc-4837-9404-54348aa3f205-kube-api-access-wp2bx\") pod \"network-check-target-xzjpm\" (UID: \"775480d3-f4fc-4837-9404-54348aa3f205\") " pod="openshift-network-diagnostics/network-check-target-xzjpm" Apr 17 10:17:22.876259 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876127 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgqs5\" (UniqueName: \"kubernetes.io/projected/f8ebcb33-32f1-469e-a5f3-c80aa250dc88-kube-api-access-tgqs5\") pod \"iptables-alerter-pg58d\" (UID: \"f8ebcb33-32f1-469e-a5f3-c80aa250dc88\") " pod="openshift-network-operator/iptables-alerter-pg58d" Apr 17 10:17:22.876259 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876122 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 10:17:22.876259 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876152 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-run\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm" Apr 17 10:17:22.876259 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876156 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-host-cni-netd\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.876259 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876201 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-host-var-lib-kubelet\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj" Apr 17 10:17:22.876259 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876209 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ec60248-7d9c-4930-984a-1104f6a4e7a0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9v4bq\" (UID: \"7ec60248-7d9c-4930-984a-1104f6a4e7a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq" Apr 17 10:17:22.876737 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876319 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-etc-sysconfig\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm" Apr 17 10:17:22.876737 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876378 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-sys\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm" Apr 17 10:17:22.876737 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876342 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j" Apr 17 10:17:22.876737 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876397 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c8515f25-7800-40c5-a053-06805e22f902-cni-binary-copy\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj" Apr 17 10:17:22.876737 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876437 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-host-run-ovn-kubernetes\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.876737 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876455 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" 
(UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-sys\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm" Apr 17 10:17:22.876737 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876459 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-etc-sysconfig\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm" Apr 17 10:17:22.876737 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876466 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-host-var-lib-cni-bin\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj" Apr 17 10:17:22.876737 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876494 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j" Apr 17 10:17:22.876737 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876513 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-cnibin\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j" Apr 17 10:17:22.876737 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876548 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f8ebcb33-32f1-469e-a5f3-c80aa250dc88-host-slash\") pod \"iptables-alerter-pg58d\" (UID: \"f8ebcb33-32f1-469e-a5f3-c80aa250dc88\") " pod="openshift-network-operator/iptables-alerter-pg58d" Apr 17 10:17:22.876737 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876577 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-cnibin\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j" Apr 17 10:17:22.876737 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876577 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.876737 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876650 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-host\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm" Apr 17 10:17:22.876737 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876698 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-host\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm" Apr 17 10:17:22.876737 ip-10-0-128-180 
kubenswrapper[2579]: I0417 10:17:22.876709 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h72sd\" (UniqueName: \"kubernetes.io/projected/e8471c84-04de-44cd-a299-b4dc3d7003ab-kube-api-access-h72sd\") pod \"node-ca-qlxxc\" (UID: \"e8471c84-04de-44cd-a299-b4dc3d7003ab\") " pod="openshift-image-registry/node-ca-qlxxc" Apr 17 10:17:22.877447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876737 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-hostroot\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj" Apr 17 10:17:22.877447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876796 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-cni-binary-copy\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j" Apr 17 10:17:22.877447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876851 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9ff2c5cc-5a1b-4207-b5da-e7cd37db94b0-konnectivity-ca\") pod \"konnectivity-agent-8pm8f\" (UID: \"9ff2c5cc-5a1b-4207-b5da-e7cd37db94b0\") " pod="kube-system/konnectivity-agent-8pm8f" Apr 17 10:17:22.877447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876938 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-lib-modules\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm" Apr 17 10:17:22.877447 
ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876964 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e8471c84-04de-44cd-a299-b4dc3d7003ab-serviceca\") pod \"node-ca-qlxxc\" (UID: \"e8471c84-04de-44cd-a299-b4dc3d7003ab\") " pod="openshift-image-registry/node-ca-qlxxc" Apr 17 10:17:22.877447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.876514 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-host-var-lib-cni-bin\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj" Apr 17 10:17:22.877447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877029 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-cnibin\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj" Apr 17 10:17:22.877447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877119 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c8515f25-7800-40c5-a053-06805e22f902-cni-binary-copy\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj" Apr 17 10:17:22.877447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877124 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-cnibin\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj" Apr 17 10:17:22.877447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877143 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-node-log\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.877447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877190 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-lib-modules\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm" Apr 17 10:17:22.877447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877192 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e8471c84-04de-44cd-a299-b4dc3d7003ab-host\") pod \"node-ca-qlxxc\" (UID: \"e8471c84-04de-44cd-a299-b4dc3d7003ab\") " pod="openshift-image-registry/node-ca-qlxxc" Apr 17 10:17:22.877447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877234 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-hostroot\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj" Apr 17 10:17:22.877447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877280 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e8471c84-04de-44cd-a299-b4dc3d7003ab-host\") pod \"node-ca-qlxxc\" (UID: \"e8471c84-04de-44cd-a299-b4dc3d7003ab\") " pod="openshift-image-registry/node-ca-qlxxc" Apr 17 10:17:22.877447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877295 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-multus-socket-dir-parent\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj" Apr 17 10:17:22.877447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877370 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j" Apr 17 10:17:22.877447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877436 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-host-slash\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.877447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877442 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-multus-socket-dir-parent\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj" Apr 17 10:17:22.878102 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877481 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7ec60248-7d9c-4930-984a-1104f6a4e7a0-etc-selinux\") pod \"aws-ebs-csi-driver-node-9v4bq\" (UID: \"7ec60248-7d9c-4930-984a-1104f6a4e7a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq" Apr 17 10:17:22.878102 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877544 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-etc-kubernetes\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm" Apr 17 10:17:22.878102 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877573 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-etc-sysctl-d\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm" Apr 17 10:17:22.878102 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877599 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-host-run-k8s-cni-cncf-io\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj" Apr 17 10:17:22.878102 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877649 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-host-run-k8s-cni-cncf-io\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj" Apr 17 10:17:22.878102 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877650 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-etc-kubernetes\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm" Apr 17 10:17:22.878102 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877696 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-host-run-netns\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.878102 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877712 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-etc-sysctl-d\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.878102 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877729 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-run-systemd\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.878102 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877734 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-cni-binary-copy\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j"
Apr 17 10:17:22.878102 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877768 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-host-run-netns\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.878102 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877767 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clf4d\" (UniqueName: \"kubernetes.io/projected/964ce349-102e-4852-b1d8-a4c58565a4a0-kube-api-access-clf4d\") pod \"network-metrics-daemon-6q4fn\" (UID: \"964ce349-102e-4852-b1d8-a4c58565a4a0\") " pod="openshift-multus/network-metrics-daemon-6q4fn"
Apr 17 10:17:22.878102 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877860 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j"
Apr 17 10:17:22.878102 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877882 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-host-var-lib-cni-multus\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.878102 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877912 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-os-release\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j"
Apr 17 10:17:22.878102 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877923 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-host-var-lib-cni-multus\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.878102 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877935 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-host-cni-bin\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.878842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877956 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e6ec344-3806-4357-8d4e-229cc8d0f613-ovnkube-config\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.878842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877983 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-os-release\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j"
Apr 17 10:17:22.878842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.877975 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f8ebcb33-32f1-469e-a5f3-c80aa250dc88-iptables-alerter-script\") pod \"iptables-alerter-pg58d\" (UID: \"f8ebcb33-32f1-469e-a5f3-c80aa250dc88\") " pod="openshift-network-operator/iptables-alerter-pg58d"
Apr 17 10:17:22.878842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.878013 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e6ec344-3806-4357-8d4e-229cc8d0f613-env-overrides\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.878842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.878042 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7ec60248-7d9c-4930-984a-1104f6a4e7a0-socket-dir\") pod \"aws-ebs-csi-driver-node-9v4bq\" (UID: \"7ec60248-7d9c-4930-984a-1104f6a4e7a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq"
Apr 17 10:17:22.878842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.878034 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9ff2c5cc-5a1b-4207-b5da-e7cd37db94b0-konnectivity-ca\") pod \"konnectivity-agent-8pm8f\" (UID: \"9ff2c5cc-5a1b-4207-b5da-e7cd37db94b0\") " pod="kube-system/konnectivity-agent-8pm8f"
Apr 17 10:17:22.878842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.878305 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7ec60248-7d9c-4930-984a-1104f6a4e7a0-device-dir\") pod \"aws-ebs-csi-driver-node-9v4bq\" (UID: \"7ec60248-7d9c-4930-984a-1104f6a4e7a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq"
Apr 17 10:17:22.878842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.878375 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nsrf\" (UniqueName: \"kubernetes.io/projected/c8515f25-7800-40c5-a053-06805e22f902-kube-api-access-6nsrf\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.878842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.878410 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-systemd-units\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.878842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.878460 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-etc-openvswitch\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.878842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.878487 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-log-socket\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.878842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.878511 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9e6ec344-3806-4357-8d4e-229cc8d0f613-ovnkube-script-lib\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.878842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.878538 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-host-kubelet\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.878842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.878572 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-etc-systemd\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.878842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.878604 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-multus-cni-dir\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.878842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.878635 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-os-release\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.878842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.878664 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9ff2c5cc-5a1b-4207-b5da-e7cd37db94b0-agent-certs\") pod \"konnectivity-agent-8pm8f\" (UID: \"9ff2c5cc-5a1b-4207-b5da-e7cd37db94b0\") " pod="kube-system/konnectivity-agent-8pm8f"
Apr 17 10:17:22.879622 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.878690 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-host-run-netns\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.879622 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.878728 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-run-openvswitch\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.879622 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:22.878865 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 10:17:22.879622 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.878950 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e8471c84-04de-44cd-a299-b4dc3d7003ab-serviceca\") pod \"node-ca-qlxxc\" (UID: \"e8471c84-04de-44cd-a299-b4dc3d7003ab\") " pod="openshift-image-registry/node-ca-qlxxc"
Apr 17 10:17:22.879622 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:22.879072 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs podName:964ce349-102e-4852-b1d8-a4c58565a4a0 nodeName:}" failed. No retries permitted until 2026-04-17 10:17:23.378912134 +0000 UTC m=+3.115098466 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs") pod "network-metrics-daemon-6q4fn" (UID: "964ce349-102e-4852-b1d8-a4c58565a4a0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 10:17:22.879622 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.879214 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-multus-cni-dir\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.879622 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.879215 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-etc-systemd\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.879622 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.879443 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-os-release\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.879988 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.879712 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs\") pod \"network-metrics-daemon-6q4fn\" (UID: \"964ce349-102e-4852-b1d8-a4c58565a4a0\") " pod="openshift-multus/network-metrics-daemon-6q4fn"
Apr 17 10:17:22.879988 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.879791 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-etc-sysctl-conf\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.879988 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.879823 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/49bcd63d-9323-4368-9c4c-5edfc7b371f7-etc-tuned\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.879988 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.879851 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-multus-conf-dir\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.879988 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.879911 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c8515f25-7800-40c5-a053-06805e22f902-multus-daemon-config\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.879988 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.879956 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-run-ovn\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.880267 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.880023 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e6ec344-3806-4357-8d4e-229cc8d0f613-ovn-node-metrics-cert\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.880267 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.880075 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c84s\" (UniqueName: \"kubernetes.io/projected/9e6ec344-3806-4357-8d4e-229cc8d0f613-kube-api-access-6c84s\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.880267 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.880159 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-etc-modprobe-d\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.880267 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.880184 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bf6sj\" (UniqueName: \"kubernetes.io/projected/49bcd63d-9323-4368-9c4c-5edfc7b371f7-kube-api-access-bf6sj\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.880267 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.880212 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-host-run-multus-certs\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.880525 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.880374 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-etc-sysctl-conf\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.880525 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.880438 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-etc-kubernetes\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.880525 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.880475 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7ec60248-7d9c-4930-984a-1104f6a4e7a0-registration-dir\") pod \"aws-ebs-csi-driver-node-9v4bq\" (UID: \"7ec60248-7d9c-4930-984a-1104f6a4e7a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq"
Apr 17 10:17:22.880525 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.880520 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-system-cni-dir\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.880718 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.880552 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7ec60248-7d9c-4930-984a-1104f6a4e7a0-sys-fs\") pod \"aws-ebs-csi-driver-node-9v4bq\" (UID: \"7ec60248-7d9c-4930-984a-1104f6a4e7a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq"
Apr 17 10:17:22.880718 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.880677 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-host-run-multus-certs\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.880819 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.880758 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/49bcd63d-9323-4368-9c4c-5edfc7b371f7-tmp\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.880819 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.880762 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/49bcd63d-9323-4368-9c4c-5edfc7b371f7-etc-modprobe-d\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.880819 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.880797 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-multus-conf-dir\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.880970 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.880845 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-etc-kubernetes\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.883069 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.881157 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c8515f25-7800-40c5-a053-06805e22f902-system-cni-dir\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.883069 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:22.881795 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 10:17:22.883069 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:22.881818 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 10:17:22.883069 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:22.881838 2579 projected.go:194] Error preparing data for projected volume kube-api-access-wp2bx for pod openshift-network-diagnostics/network-check-target-xzjpm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 10:17:22.883069 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:22.881914 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/775480d3-f4fc-4837-9404-54348aa3f205-kube-api-access-wp2bx podName:775480d3-f4fc-4837-9404-54348aa3f205 nodeName:}" failed. No retries permitted until 2026-04-17 10:17:23.381889375 +0000 UTC m=+3.118075706 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-wp2bx" (UniqueName: "kubernetes.io/projected/775480d3-f4fc-4837-9404-54348aa3f205-kube-api-access-wp2bx") pod "network-check-target-xzjpm" (UID: "775480d3-f4fc-4837-9404-54348aa3f205") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 10:17:22.883069 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.881982 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c8515f25-7800-40c5-a053-06805e22f902-multus-daemon-config\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.884356 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.884285 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/49bcd63d-9323-4368-9c4c-5edfc7b371f7-etc-tuned\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.885568 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.885547 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9ff2c5cc-5a1b-4207-b5da-e7cd37db94b0-agent-certs\") pod \"konnectivity-agent-8pm8f\" (UID: \"9ff2c5cc-5a1b-4207-b5da-e7cd37db94b0\") " pod="kube-system/konnectivity-agent-8pm8f"
Apr 17 10:17:22.885964 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.885919 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h72sd\" (UniqueName: \"kubernetes.io/projected/e8471c84-04de-44cd-a299-b4dc3d7003ab-kube-api-access-h72sd\") pod \"node-ca-qlxxc\" (UID: \"e8471c84-04de-44cd-a299-b4dc3d7003ab\") " pod="openshift-image-registry/node-ca-qlxxc"
Apr 17 10:17:22.886059 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.885970 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6pgj\" (UniqueName: \"kubernetes.io/projected/c1879161-9d08-4f31-af72-d7e9b7b6b0c7-kube-api-access-x6pgj\") pod \"multus-additional-cni-plugins-jwf2j\" (UID: \"c1879161-9d08-4f31-af72-d7e9b7b6b0c7\") " pod="openshift-multus/multus-additional-cni-plugins-jwf2j"
Apr 17 10:17:22.886857 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.886810 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clf4d\" (UniqueName: \"kubernetes.io/projected/964ce349-102e-4852-b1d8-a4c58565a4a0-kube-api-access-clf4d\") pod \"network-metrics-daemon-6q4fn\" (UID: \"964ce349-102e-4852-b1d8-a4c58565a4a0\") " pod="openshift-multus/network-metrics-daemon-6q4fn"
Apr 17 10:17:22.887907 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.887864 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nsrf\" (UniqueName: \"kubernetes.io/projected/c8515f25-7800-40c5-a053-06805e22f902-kube-api-access-6nsrf\") pod \"multus-v8cbj\" (UID: \"c8515f25-7800-40c5-a053-06805e22f902\") " pod="openshift-multus/multus-v8cbj"
Apr 17 10:17:22.891526 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.891505 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf6sj\" (UniqueName: \"kubernetes.io/projected/49bcd63d-9323-4368-9c4c-5edfc7b371f7-kube-api-access-bf6sj\") pod \"tuned-hdfrm\" (UID: \"49bcd63d-9323-4368-9c4c-5edfc7b371f7\") " pod="openshift-cluster-node-tuning-operator/tuned-hdfrm"
Apr 17 10:17:22.981049 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981015 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-host-slash\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981049 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981050 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7ec60248-7d9c-4930-984a-1104f6a4e7a0-etc-selinux\") pod \"aws-ebs-csi-driver-node-9v4bq\" (UID: \"7ec60248-7d9c-4930-984a-1104f6a4e7a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq"
Apr 17 10:17:22.981280 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981076 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-run-systemd\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981280 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981128 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-host-cni-bin\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981280 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981170 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e6ec344-3806-4357-8d4e-229cc8d0f613-ovnkube-config\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981280 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981133 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-host-slash\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981280 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981202 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f8ebcb33-32f1-469e-a5f3-c80aa250dc88-iptables-alerter-script\") pod \"iptables-alerter-pg58d\" (UID: \"f8ebcb33-32f1-469e-a5f3-c80aa250dc88\") " pod="openshift-network-operator/iptables-alerter-pg58d"
Apr 17 10:17:22.981280 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981215 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-host-cni-bin\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981280 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981229 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e6ec344-3806-4357-8d4e-229cc8d0f613-env-overrides\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981280 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981231 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-run-systemd\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981280 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981199 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7ec60248-7d9c-4930-984a-1104f6a4e7a0-etc-selinux\") pod \"aws-ebs-csi-driver-node-9v4bq\" (UID: \"7ec60248-7d9c-4930-984a-1104f6a4e7a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq"
Apr 17 10:17:22.981280 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981267 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7ec60248-7d9c-4930-984a-1104f6a4e7a0-socket-dir\") pod \"aws-ebs-csi-driver-node-9v4bq\" (UID: \"7ec60248-7d9c-4930-984a-1104f6a4e7a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq"
Apr 17 10:17:22.981658 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981294 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7ec60248-7d9c-4930-984a-1104f6a4e7a0-device-dir\") pod \"aws-ebs-csi-driver-node-9v4bq\" (UID: \"7ec60248-7d9c-4930-984a-1104f6a4e7a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq"
Apr 17 10:17:22.981658 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981321 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-systemd-units\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981658 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981346 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-etc-openvswitch\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981658 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981369 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-log-socket\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981658 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981384 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7ec60248-7d9c-4930-984a-1104f6a4e7a0-socket-dir\") pod \"aws-ebs-csi-driver-node-9v4bq\" (UID: \"7ec60248-7d9c-4930-984a-1104f6a4e7a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq"
Apr 17 10:17:22.981658 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981393 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9e6ec344-3806-4357-8d4e-229cc8d0f613-ovnkube-script-lib\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981658 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981448 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-host-kubelet\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981658 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981459 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-systemd-units\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981658 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981481 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-host-run-netns\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981658 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981508 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-run-openvswitch\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981658 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981502 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-etc-openvswitch\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981658 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981554 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-run-ovn\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981658 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981560 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-log-socket\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981658 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981579 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e6ec344-3806-4357-8d4e-229cc8d0f613-ovn-node-metrics-cert\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981658 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981598 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-host-run-netns\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981658 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981605 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6c84s\" (UniqueName: \"kubernetes.io/projected/9e6ec344-3806-4357-8d4e-229cc8d0f613-kube-api-access-6c84s\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:17:22.981658 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981638 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7ec60248-7d9c-4930-984a-1104f6a4e7a0-registration-dir\") pod \"aws-ebs-csi-driver-node-9v4bq\" (UID: \"7ec60248-7d9c-4930-984a-1104f6a4e7a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq"
Apr 17 10:17:22.982131 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981666 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7ec60248-7d9c-4930-984a-1104f6a4e7a0-sys-fs\") pod \"aws-ebs-csi-driver-node-9v4bq\" (UID: \"7ec60248-7d9c-4930-984a-1104f6a4e7a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq"
Apr 17 10:17:22.982131 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981696 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-var-lib-openvswitch\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.982131 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981723 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6c7kh\" (UniqueName: \"kubernetes.io/projected/7ec60248-7d9c-4930-984a-1104f6a4e7a0-kube-api-access-6c7kh\") pod \"aws-ebs-csi-driver-node-9v4bq\" (UID: \"7ec60248-7d9c-4930-984a-1104f6a4e7a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq" Apr 17 10:17:22.982131 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981745 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e6ec344-3806-4357-8d4e-229cc8d0f613-env-overrides\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.982131 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981763 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgqs5\" (UniqueName: \"kubernetes.io/projected/f8ebcb33-32f1-469e-a5f3-c80aa250dc88-kube-api-access-tgqs5\") pod \"iptables-alerter-pg58d\" (UID: \"f8ebcb33-32f1-469e-a5f3-c80aa250dc88\") " pod="openshift-network-operator/iptables-alerter-pg58d" Apr 17 10:17:22.982131 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981790 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-host-cni-netd\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 
10:17:22.982131 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981834 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ec60248-7d9c-4930-984a-1104f6a4e7a0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9v4bq\" (UID: \"7ec60248-7d9c-4930-984a-1104f6a4e7a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq" Apr 17 10:17:22.982131 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981856 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e6ec344-3806-4357-8d4e-229cc8d0f613-ovnkube-config\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.982131 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981859 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f8ebcb33-32f1-469e-a5f3-c80aa250dc88-iptables-alerter-script\") pod \"iptables-alerter-pg58d\" (UID: \"f8ebcb33-32f1-469e-a5f3-c80aa250dc88\") " pod="openshift-network-operator/iptables-alerter-pg58d" Apr 17 10:17:22.982131 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981862 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-host-run-ovn-kubernetes\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.982131 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981915 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7ec60248-7d9c-4930-984a-1104f6a4e7a0-registration-dir\") pod \"aws-ebs-csi-driver-node-9v4bq\" (UID: 
\"7ec60248-7d9c-4930-984a-1104f6a4e7a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq" Apr 17 10:17:22.982131 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981915 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-run-openvswitch\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.982131 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981915 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f8ebcb33-32f1-469e-a5f3-c80aa250dc88-host-slash\") pod \"iptables-alerter-pg58d\" (UID: \"f8ebcb33-32f1-469e-a5f3-c80aa250dc88\") " pod="openshift-network-operator/iptables-alerter-pg58d" Apr 17 10:17:22.982131 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981951 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-host-kubelet\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.982131 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981960 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.982131 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.981996 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-node-log\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.982131 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.982005 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-host-run-ovn-kubernetes\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.982691 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.982044 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-run-ovn\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.982691 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.982064 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9e6ec344-3806-4357-8d4e-229cc8d0f613-ovnkube-script-lib\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.982691 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.982094 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7ec60248-7d9c-4930-984a-1104f6a4e7a0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9v4bq\" (UID: \"7ec60248-7d9c-4930-984a-1104f6a4e7a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq" Apr 17 10:17:22.982691 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.982127 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/7ec60248-7d9c-4930-984a-1104f6a4e7a0-sys-fs\") pod \"aws-ebs-csi-driver-node-9v4bq\" (UID: \"7ec60248-7d9c-4930-984a-1104f6a4e7a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq" Apr 17 10:17:22.982691 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.982130 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.982691 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.982126 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7ec60248-7d9c-4930-984a-1104f6a4e7a0-device-dir\") pod \"aws-ebs-csi-driver-node-9v4bq\" (UID: \"7ec60248-7d9c-4930-984a-1104f6a4e7a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq" Apr 17 10:17:22.982691 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.982155 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f8ebcb33-32f1-469e-a5f3-c80aa250dc88-host-slash\") pod \"iptables-alerter-pg58d\" (UID: \"f8ebcb33-32f1-469e-a5f3-c80aa250dc88\") " pod="openshift-network-operator/iptables-alerter-pg58d" Apr 17 10:17:22.982691 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.982169 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-var-lib-openvswitch\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.982691 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.982170 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-node-log\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.982691 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.982193 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9e6ec344-3806-4357-8d4e-229cc8d0f613-host-cni-netd\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.984362 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.984327 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e6ec344-3806-4357-8d4e-229cc8d0f613-ovn-node-metrics-cert\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.989644 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.989550 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c7kh\" (UniqueName: \"kubernetes.io/projected/7ec60248-7d9c-4930-984a-1104f6a4e7a0-kube-api-access-6c7kh\") pod \"aws-ebs-csi-driver-node-9v4bq\" (UID: \"7ec60248-7d9c-4930-984a-1104f6a4e7a0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq" Apr 17 10:17:22.989644 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:22.989548 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c84s\" (UniqueName: \"kubernetes.io/projected/9e6ec344-3806-4357-8d4e-229cc8d0f613-kube-api-access-6c84s\") pod \"ovnkube-node-ggsw6\" (UID: \"9e6ec344-3806-4357-8d4e-229cc8d0f613\") " pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:22.989794 ip-10-0-128-180 kubenswrapper[2579]: 
I0417 10:17:22.989694 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgqs5\" (UniqueName: \"kubernetes.io/projected/f8ebcb33-32f1-469e-a5f3-c80aa250dc88-kube-api-access-tgqs5\") pod \"iptables-alerter-pg58d\" (UID: \"f8ebcb33-32f1-469e-a5f3-c80aa250dc88\") " pod="openshift-network-operator/iptables-alerter-pg58d" Apr 17 10:17:23.062677 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.062642 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8pm8f" Apr 17 10:17:23.070529 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.070497 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hdfrm" Apr 17 10:17:23.081411 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.081378 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qlxxc" Apr 17 10:17:23.087230 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.087200 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-v8cbj" Apr 17 10:17:23.094925 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.094895 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jwf2j" Apr 17 10:17:23.102601 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.102572 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-pg58d" Apr 17 10:17:23.109372 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.109352 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:23.116051 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.116032 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq" Apr 17 10:17:23.257517 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.257444 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 10:17:23.384724 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.384693 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wp2bx\" (UniqueName: \"kubernetes.io/projected/775480d3-f4fc-4837-9404-54348aa3f205-kube-api-access-wp2bx\") pod \"network-check-target-xzjpm\" (UID: \"775480d3-f4fc-4837-9404-54348aa3f205\") " pod="openshift-network-diagnostics/network-check-target-xzjpm" Apr 17 10:17:23.384882 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.384760 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs\") pod \"network-metrics-daemon-6q4fn\" (UID: \"964ce349-102e-4852-b1d8-a4c58565a4a0\") " pod="openshift-multus/network-metrics-daemon-6q4fn" Apr 17 10:17:23.384882 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:23.384849 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 10:17:23.384882 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:23.384870 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 10:17:23.384882 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:23.384868 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 10:17:23.384882 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:23.384880 2579 projected.go:194] Error 
preparing data for projected volume kube-api-access-wp2bx for pod openshift-network-diagnostics/network-check-target-xzjpm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 10:17:23.385088 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:23.384931 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs podName:964ce349-102e-4852-b1d8-a4c58565a4a0 nodeName:}" failed. No retries permitted until 2026-04-17 10:17:24.384912797 +0000 UTC m=+4.121099128 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs") pod "network-metrics-daemon-6q4fn" (UID: "964ce349-102e-4852-b1d8-a4c58565a4a0") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 10:17:23.385088 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:23.384948 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/775480d3-f4fc-4837-9404-54348aa3f205-kube-api-access-wp2bx podName:775480d3-f4fc-4837-9404-54348aa3f205 nodeName:}" failed. No retries permitted until 2026-04-17 10:17:24.384940424 +0000 UTC m=+4.121126755 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wp2bx" (UniqueName: "kubernetes.io/projected/775480d3-f4fc-4837-9404-54348aa3f205-kube-api-access-wp2bx") pod "network-check-target-xzjpm" (UID: "775480d3-f4fc-4837-9404-54348aa3f205") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 10:17:23.657700 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:23.657657 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e6ec344_3806_4357_8d4e_229cc8d0f613.slice/crio-73aa8042732a80adf87762674ad135bffd3da1046a73ed11c192a03e168326b6 WatchSource:0}: Error finding container 73aa8042732a80adf87762674ad135bffd3da1046a73ed11c192a03e168326b6: Status 404 returned error can't find the container with id 73aa8042732a80adf87762674ad135bffd3da1046a73ed11c192a03e168326b6 Apr 17 10:17:23.659182 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:23.659158 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8471c84_04de_44cd_a299_b4dc3d7003ab.slice/crio-0bdeeffeca7708c0674ea0f47e8a56cc42a434e8a11a037e06f61253ad62e655 WatchSource:0}: Error finding container 0bdeeffeca7708c0674ea0f47e8a56cc42a434e8a11a037e06f61253ad62e655: Status 404 returned error can't find the container with id 0bdeeffeca7708c0674ea0f47e8a56cc42a434e8a11a037e06f61253ad62e655 Apr 17 10:17:23.659594 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:23.659570 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49bcd63d_9323_4368_9c4c_5edfc7b371f7.slice/crio-2e2b98b4a0cc556d2eedf96e00e155cf4567318fb24206f01f8eec37fcbcf96e WatchSource:0}: Error finding container 2e2b98b4a0cc556d2eedf96e00e155cf4567318fb24206f01f8eec37fcbcf96e: Status 404 returned error can't find the 
container with id 2e2b98b4a0cc556d2eedf96e00e155cf4567318fb24206f01f8eec37fcbcf96e Apr 17 10:17:23.682912 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:23.682837 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ec60248_7d9c_4930_984a_1104f6a4e7a0.slice/crio-d2f20bacf948f3ddfbc13b924f8c9361fcf2af98b3ce170af509b0d1350a6974 WatchSource:0}: Error finding container d2f20bacf948f3ddfbc13b924f8c9361fcf2af98b3ce170af509b0d1350a6974: Status 404 returned error can't find the container with id d2f20bacf948f3ddfbc13b924f8c9361fcf2af98b3ce170af509b0d1350a6974 Apr 17 10:17:23.684787 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:23.684766 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8515f25_7800_40c5_a053_06805e22f902.slice/crio-a1fc8da20ac835a4235fd33f22ea98f0c26c4ab177e318ae3a0dba1c74d69f74 WatchSource:0}: Error finding container a1fc8da20ac835a4235fd33f22ea98f0c26c4ab177e318ae3a0dba1c74d69f74: Status 404 returned error can't find the container with id a1fc8da20ac835a4235fd33f22ea98f0c26c4ab177e318ae3a0dba1c74d69f74 Apr 17 10:17:23.686272 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:23.686252 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1879161_9d08_4f31_af72_d7e9b7b6b0c7.slice/crio-0588ef33cb69a749878a1df559e66c1774f2fea67bbf528045e1e1300e01cf7f WatchSource:0}: Error finding container 0588ef33cb69a749878a1df559e66c1774f2fea67bbf528045e1e1300e01cf7f: Status 404 returned error can't find the container with id 0588ef33cb69a749878a1df559e66c1774f2fea67bbf528045e1e1300e01cf7f Apr 17 10:17:23.687440 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:23.687352 2579 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8ebcb33_32f1_469e_a5f3_c80aa250dc88.slice/crio-5e798eee9af106a0795fb326f11b0ca28c2da3d9302b520b6e0c50f0333913dc WatchSource:0}: Error finding container 5e798eee9af106a0795fb326f11b0ca28c2da3d9302b520b6e0c50f0333913dc: Status 404 returned error can't find the container with id 5e798eee9af106a0795fb326f11b0ca28c2da3d9302b520b6e0c50f0333913dc Apr 17 10:17:23.689376 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:23.689324 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ff2c5cc_5a1b_4207_b5da_e7cd37db94b0.slice/crio-eb4cd3b4961bc13972e023a52b4b8469e4e59f141779bad4e7965ff3feafea8f WatchSource:0}: Error finding container eb4cd3b4961bc13972e023a52b4b8469e4e59f141779bad4e7965ff3feafea8f: Status 404 returned error can't find the container with id eb4cd3b4961bc13972e023a52b4b8469e4e59f141779bad4e7965ff3feafea8f Apr 17 10:17:23.799717 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.799683 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 10:12:21 +0000 UTC" deadline="2027-10-05 16:15:55.261290234 +0000 UTC" Apr 17 10:17:23.799717 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.799712 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12869h58m31.46158075s" Apr 17 10:17:23.864143 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.864105 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6q4fn" Apr 17 10:17:23.864308 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:23.864269 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6q4fn" podUID="964ce349-102e-4852-b1d8-a4c58565a4a0" Apr 17 10:17:23.873359 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.873326 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-180.ec2.internal" event={"ID":"63dc35addb937cfe54aeec10c17385f5","Type":"ContainerStarted","Data":"c31182b8e44e8b271923aef987385b261c7eb29c771ff37382e825ddeed7fdd4"} Apr 17 10:17:23.874307 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.874284 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8pm8f" event={"ID":"9ff2c5cc-5a1b-4207-b5da-e7cd37db94b0","Type":"ContainerStarted","Data":"eb4cd3b4961bc13972e023a52b4b8469e4e59f141779bad4e7965ff3feafea8f"} Apr 17 10:17:23.875227 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.875197 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v8cbj" event={"ID":"c8515f25-7800-40c5-a053-06805e22f902","Type":"ContainerStarted","Data":"a1fc8da20ac835a4235fd33f22ea98f0c26c4ab177e318ae3a0dba1c74d69f74"} Apr 17 10:17:23.876238 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.876202 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hdfrm" event={"ID":"49bcd63d-9323-4368-9c4c-5edfc7b371f7","Type":"ContainerStarted","Data":"2e2b98b4a0cc556d2eedf96e00e155cf4567318fb24206f01f8eec37fcbcf96e"} Apr 17 10:17:23.877216 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.877195 2579 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq" event={"ID":"7ec60248-7d9c-4930-984a-1104f6a4e7a0","Type":"ContainerStarted","Data":"d2f20bacf948f3ddfbc13b924f8c9361fcf2af98b3ce170af509b0d1350a6974"}
Apr 17 10:17:23.878095 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.878073 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" event={"ID":"9e6ec344-3806-4357-8d4e-229cc8d0f613","Type":"ContainerStarted","Data":"73aa8042732a80adf87762674ad135bffd3da1046a73ed11c192a03e168326b6"}
Apr 17 10:17:23.878945 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.878927 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jwf2j" event={"ID":"c1879161-9d08-4f31-af72-d7e9b7b6b0c7","Type":"ContainerStarted","Data":"0588ef33cb69a749878a1df559e66c1774f2fea67bbf528045e1e1300e01cf7f"}
Apr 17 10:17:23.879761 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.879742 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-pg58d" event={"ID":"f8ebcb33-32f1-469e-a5f3-c80aa250dc88","Type":"ContainerStarted","Data":"5e798eee9af106a0795fb326f11b0ca28c2da3d9302b520b6e0c50f0333913dc"}
Apr 17 10:17:23.880525 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:23.880507 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qlxxc" event={"ID":"e8471c84-04de-44cd-a299-b4dc3d7003ab","Type":"ContainerStarted","Data":"0bdeeffeca7708c0674ea0f47e8a56cc42a434e8a11a037e06f61253ad62e655"}
Apr 17 10:17:24.391281 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:24.391239 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs\") pod \"network-metrics-daemon-6q4fn\" (UID: \"964ce349-102e-4852-b1d8-a4c58565a4a0\") " pod="openshift-multus/network-metrics-daemon-6q4fn"
Apr 17 10:17:24.391470 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:24.391308 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wp2bx\" (UniqueName: \"kubernetes.io/projected/775480d3-f4fc-4837-9404-54348aa3f205-kube-api-access-wp2bx\") pod \"network-check-target-xzjpm\" (UID: \"775480d3-f4fc-4837-9404-54348aa3f205\") " pod="openshift-network-diagnostics/network-check-target-xzjpm"
Apr 17 10:17:24.391552 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:24.391478 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 10:17:24.391552 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:24.391495 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 10:17:24.391552 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:24.391476 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 10:17:24.391695 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:24.391575 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs podName:964ce349-102e-4852-b1d8-a4c58565a4a0 nodeName:}" failed. No retries permitted until 2026-04-17 10:17:26.391556338 +0000 UTC m=+6.127742671 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs") pod "network-metrics-daemon-6q4fn" (UID: "964ce349-102e-4852-b1d8-a4c58565a4a0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 10:17:24.391695 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:24.391621 2579 projected.go:194] Error preparing data for projected volume kube-api-access-wp2bx for pod openshift-network-diagnostics/network-check-target-xzjpm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 10:17:24.391695 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:24.391656 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/775480d3-f4fc-4837-9404-54348aa3f205-kube-api-access-wp2bx podName:775480d3-f4fc-4837-9404-54348aa3f205 nodeName:}" failed. No retries permitted until 2026-04-17 10:17:26.391645385 +0000 UTC m=+6.127831717 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-wp2bx" (UniqueName: "kubernetes.io/projected/775480d3-f4fc-4837-9404-54348aa3f205-kube-api-access-wp2bx") pod "network-check-target-xzjpm" (UID: "775480d3-f4fc-4837-9404-54348aa3f205") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 10:17:24.866799 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:24.866267 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzjpm"
Apr 17 10:17:24.866799 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:24.866399 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzjpm" podUID="775480d3-f4fc-4837-9404-54348aa3f205"
Apr 17 10:17:24.893672 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:24.893090 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal" event={"ID":"e9b8fe15074523188d471c5f98f7d062","Type":"ContainerStarted","Data":"799aa8e34383106526445af617beb6b3c663d9aba738ffe48821dfaa6511b318"}
Apr 17 10:17:24.906539 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:24.905956 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-180.ec2.internal" podStartSLOduration=2.905914653 podStartE2EDuration="2.905914653s" podCreationTimestamp="2026-04-17 10:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 10:17:23.884889398 +0000 UTC m=+3.621075744" watchObservedRunningTime="2026-04-17 10:17:24.905914653 +0000 UTC m=+4.642101007"
Apr 17 10:17:25.864856 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:25.864816 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6q4fn"
Apr 17 10:17:25.865056 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:25.864972 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6q4fn" podUID="964ce349-102e-4852-b1d8-a4c58565a4a0"
Apr 17 10:17:25.901367 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:25.900598 2579 generic.go:358] "Generic (PLEG): container finished" podID="e9b8fe15074523188d471c5f98f7d062" containerID="799aa8e34383106526445af617beb6b3c663d9aba738ffe48821dfaa6511b318" exitCode=0
Apr 17 10:17:25.901367 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:25.900648 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal" event={"ID":"e9b8fe15074523188d471c5f98f7d062","Type":"ContainerDied","Data":"799aa8e34383106526445af617beb6b3c663d9aba738ffe48821dfaa6511b318"}
Apr 17 10:17:26.416333 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:26.416284 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wp2bx\" (UniqueName: \"kubernetes.io/projected/775480d3-f4fc-4837-9404-54348aa3f205-kube-api-access-wp2bx\") pod \"network-check-target-xzjpm\" (UID: \"775480d3-f4fc-4837-9404-54348aa3f205\") " pod="openshift-network-diagnostics/network-check-target-xzjpm"
Apr 17 10:17:26.416540 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:26.416360 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs\") pod \"network-metrics-daemon-6q4fn\" (UID: \"964ce349-102e-4852-b1d8-a4c58565a4a0\") " pod="openshift-multus/network-metrics-daemon-6q4fn"
Apr 17 10:17:26.416540 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:26.416492 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 10:17:26.416658 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:26.416557 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs podName:964ce349-102e-4852-b1d8-a4c58565a4a0 nodeName:}" failed. No retries permitted until 2026-04-17 10:17:30.416537714 +0000 UTC m=+10.152724048 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs") pod "network-metrics-daemon-6q4fn" (UID: "964ce349-102e-4852-b1d8-a4c58565a4a0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 10:17:26.416658 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:26.416646 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 10:17:26.416761 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:26.416662 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 10:17:26.416761 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:26.416675 2579 projected.go:194] Error preparing data for projected volume kube-api-access-wp2bx for pod openshift-network-diagnostics/network-check-target-xzjpm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 10:17:26.416761 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:26.416713 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/775480d3-f4fc-4837-9404-54348aa3f205-kube-api-access-wp2bx podName:775480d3-f4fc-4837-9404-54348aa3f205 nodeName:}" failed. No retries permitted until 2026-04-17 10:17:30.416700094 +0000 UTC m=+10.152886429 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-wp2bx" (UniqueName: "kubernetes.io/projected/775480d3-f4fc-4837-9404-54348aa3f205-kube-api-access-wp2bx") pod "network-check-target-xzjpm" (UID: "775480d3-f4fc-4837-9404-54348aa3f205") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 10:17:26.864815 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:26.864782 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzjpm"
Apr 17 10:17:26.864999 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:26.864912 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzjpm" podUID="775480d3-f4fc-4837-9404-54348aa3f205"
Apr 17 10:17:27.863943 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:27.863907 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6q4fn"
Apr 17 10:17:27.864393 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:27.864067 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6q4fn" podUID="964ce349-102e-4852-b1d8-a4c58565a4a0"
Apr 17 10:17:28.864325 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:28.864238 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzjpm"
Apr 17 10:17:28.864796 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:28.864380 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzjpm" podUID="775480d3-f4fc-4837-9404-54348aa3f205"
Apr 17 10:17:29.863757 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:29.863726 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6q4fn"
Apr 17 10:17:29.863939 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:29.863858 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6q4fn" podUID="964ce349-102e-4852-b1d8-a4c58565a4a0"
Apr 17 10:17:30.451352 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:30.450508 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs\") pod \"network-metrics-daemon-6q4fn\" (UID: \"964ce349-102e-4852-b1d8-a4c58565a4a0\") " pod="openshift-multus/network-metrics-daemon-6q4fn"
Apr 17 10:17:30.451352 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:30.450577 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wp2bx\" (UniqueName: \"kubernetes.io/projected/775480d3-f4fc-4837-9404-54348aa3f205-kube-api-access-wp2bx\") pod \"network-check-target-xzjpm\" (UID: \"775480d3-f4fc-4837-9404-54348aa3f205\") " pod="openshift-network-diagnostics/network-check-target-xzjpm"
Apr 17 10:17:30.451352 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:30.450754 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 10:17:30.451352 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:30.450771 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 10:17:30.451352 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:30.450785 2579 projected.go:194] Error preparing data for projected volume kube-api-access-wp2bx for pod openshift-network-diagnostics/network-check-target-xzjpm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 10:17:30.451352 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:30.450846 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/775480d3-f4fc-4837-9404-54348aa3f205-kube-api-access-wp2bx podName:775480d3-f4fc-4837-9404-54348aa3f205 nodeName:}" failed. No retries permitted until 2026-04-17 10:17:38.450827823 +0000 UTC m=+18.187014177 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-wp2bx" (UniqueName: "kubernetes.io/projected/775480d3-f4fc-4837-9404-54348aa3f205-kube-api-access-wp2bx") pod "network-check-target-xzjpm" (UID: "775480d3-f4fc-4837-9404-54348aa3f205") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 10:17:30.451352 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:30.451262 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 10:17:30.451352 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:30.451311 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs podName:964ce349-102e-4852-b1d8-a4c58565a4a0 nodeName:}" failed. No retries permitted until 2026-04-17 10:17:38.451295649 +0000 UTC m=+18.187481986 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs") pod "network-metrics-daemon-6q4fn" (UID: "964ce349-102e-4852-b1d8-a4c58565a4a0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 10:17:30.864371 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:30.864329 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzjpm"
Apr 17 10:17:30.864577 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:30.864484 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzjpm" podUID="775480d3-f4fc-4837-9404-54348aa3f205"
Apr 17 10:17:31.864127 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:31.864086 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6q4fn"
Apr 17 10:17:31.864620 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:31.864228 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6q4fn" podUID="964ce349-102e-4852-b1d8-a4c58565a4a0"
Apr 17 10:17:32.869115 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:32.869088 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzjpm"
Apr 17 10:17:32.869556 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:32.869199 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzjpm" podUID="775480d3-f4fc-4837-9404-54348aa3f205"
Apr 17 10:17:33.864410 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:33.864368 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6q4fn"
Apr 17 10:17:33.864594 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:33.864536 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6q4fn" podUID="964ce349-102e-4852-b1d8-a4c58565a4a0"
Apr 17 10:17:34.864169 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:34.864133 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzjpm"
Apr 17 10:17:34.864575 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:34.864275 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzjpm" podUID="775480d3-f4fc-4837-9404-54348aa3f205"
Apr 17 10:17:35.863750 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:35.863718 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6q4fn"
Apr 17 10:17:35.863947 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:35.863857 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6q4fn" podUID="964ce349-102e-4852-b1d8-a4c58565a4a0"
Apr 17 10:17:36.864182 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:36.864149 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzjpm"
Apr 17 10:17:36.864623 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:36.864273 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzjpm" podUID="775480d3-f4fc-4837-9404-54348aa3f205"
Apr 17 10:17:37.648209 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:37.648157 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-6lwvd"]
Apr 17 10:17:37.654953 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:37.654921 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6lwvd"
Apr 17 10:17:37.658376 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:37.658333 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 10:17:37.658698 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:37.658667 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 10:17:37.658807 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:37.658704 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4cpp5\""
Apr 17 10:17:37.704545 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:37.704506 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/730f5024-a732-4fc5-b09b-dab4adee9607-hosts-file\") pod \"node-resolver-6lwvd\" (UID: \"730f5024-a732-4fc5-b09b-dab4adee9607\") " pod="openshift-dns/node-resolver-6lwvd"
Apr 17 10:17:37.704722 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:37.704561 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/730f5024-a732-4fc5-b09b-dab4adee9607-tmp-dir\") pod \"node-resolver-6lwvd\" (UID: \"730f5024-a732-4fc5-b09b-dab4adee9607\") " pod="openshift-dns/node-resolver-6lwvd"
Apr 17 10:17:37.704722 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:37.704614 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgr7d\" (UniqueName: \"kubernetes.io/projected/730f5024-a732-4fc5-b09b-dab4adee9607-kube-api-access-wgr7d\") pod \"node-resolver-6lwvd\" (UID: \"730f5024-a732-4fc5-b09b-dab4adee9607\") " pod="openshift-dns/node-resolver-6lwvd"
Apr 17 10:17:37.805388 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:37.805346 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/730f5024-a732-4fc5-b09b-dab4adee9607-hosts-file\") pod \"node-resolver-6lwvd\" (UID: \"730f5024-a732-4fc5-b09b-dab4adee9607\") " pod="openshift-dns/node-resolver-6lwvd"
Apr 17 10:17:37.805583 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:37.805404 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/730f5024-a732-4fc5-b09b-dab4adee9607-tmp-dir\") pod \"node-resolver-6lwvd\" (UID: \"730f5024-a732-4fc5-b09b-dab4adee9607\") " pod="openshift-dns/node-resolver-6lwvd"
Apr 17 10:17:37.805583 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:37.805450 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgr7d\" (UniqueName: \"kubernetes.io/projected/730f5024-a732-4fc5-b09b-dab4adee9607-kube-api-access-wgr7d\") pod \"node-resolver-6lwvd\" (UID: \"730f5024-a732-4fc5-b09b-dab4adee9607\") " pod="openshift-dns/node-resolver-6lwvd"
Apr 17 10:17:37.805683 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:37.805451 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/730f5024-a732-4fc5-b09b-dab4adee9607-hosts-file\") pod \"node-resolver-6lwvd\" (UID: \"730f5024-a732-4fc5-b09b-dab4adee9607\") " pod="openshift-dns/node-resolver-6lwvd"
Apr 17 10:17:37.805745 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:37.805724 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/730f5024-a732-4fc5-b09b-dab4adee9607-tmp-dir\") pod \"node-resolver-6lwvd\" (UID: \"730f5024-a732-4fc5-b09b-dab4adee9607\") " pod="openshift-dns/node-resolver-6lwvd"
Apr 17 10:17:37.814700 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:37.814662 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgr7d\" (UniqueName: \"kubernetes.io/projected/730f5024-a732-4fc5-b09b-dab4adee9607-kube-api-access-wgr7d\") pod \"node-resolver-6lwvd\" (UID: \"730f5024-a732-4fc5-b09b-dab4adee9607\") " pod="openshift-dns/node-resolver-6lwvd"
Apr 17 10:17:37.864165 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:37.864128 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6q4fn"
Apr 17 10:17:37.864348 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:37.864266 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6q4fn" podUID="964ce349-102e-4852-b1d8-a4c58565a4a0"
Apr 17 10:17:37.966390 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:37.966300 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6lwvd"
Apr 17 10:17:38.510939 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:38.510902 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs\") pod \"network-metrics-daemon-6q4fn\" (UID: \"964ce349-102e-4852-b1d8-a4c58565a4a0\") " pod="openshift-multus/network-metrics-daemon-6q4fn"
Apr 17 10:17:38.511130 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:38.510965 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wp2bx\" (UniqueName: \"kubernetes.io/projected/775480d3-f4fc-4837-9404-54348aa3f205-kube-api-access-wp2bx\") pod \"network-check-target-xzjpm\" (UID: \"775480d3-f4fc-4837-9404-54348aa3f205\") " pod="openshift-network-diagnostics/network-check-target-xzjpm"
Apr 17 10:17:38.511130 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:38.511062 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 10:17:38.511130 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:38.511092 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 10:17:38.511130 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:38.511107 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 10:17:38.511130 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:38.511119 2579 projected.go:194] Error preparing data for projected volume kube-api-access-wp2bx for pod openshift-network-diagnostics/network-check-target-xzjpm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 10:17:38.511367 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:38.511134 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs podName:964ce349-102e-4852-b1d8-a4c58565a4a0 nodeName:}" failed. No retries permitted until 2026-04-17 10:17:54.511111782 +0000 UTC m=+34.247298117 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs") pod "network-metrics-daemon-6q4fn" (UID: "964ce349-102e-4852-b1d8-a4c58565a4a0") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 10:17:38.511367 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:38.511169 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/775480d3-f4fc-4837-9404-54348aa3f205-kube-api-access-wp2bx podName:775480d3-f4fc-4837-9404-54348aa3f205 nodeName:}" failed. No retries permitted until 2026-04-17 10:17:54.511152835 +0000 UTC m=+34.247339170 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-wp2bx" (UniqueName: "kubernetes.io/projected/775480d3-f4fc-4837-9404-54348aa3f205-kube-api-access-wp2bx") pod "network-check-target-xzjpm" (UID: "775480d3-f4fc-4837-9404-54348aa3f205") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 10:17:38.863838 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:38.863798 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzjpm"
Apr 17 10:17:38.864034 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:38.863922 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzjpm" podUID="775480d3-f4fc-4837-9404-54348aa3f205"
Apr 17 10:17:39.864272 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:39.864232 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6q4fn"
Apr 17 10:17:39.864677 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:39.864390 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6q4fn" podUID="964ce349-102e-4852-b1d8-a4c58565a4a0"
Apr 17 10:17:40.864115 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:40.864078 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzjpm"
Apr 17 10:17:40.864220 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:40.864177 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzjpm" podUID="775480d3-f4fc-4837-9404-54348aa3f205"
Apr 17 10:17:40.926431 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:40.926384 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hdfrm" event={"ID":"49bcd63d-9323-4368-9c4c-5edfc7b371f7","Type":"ContainerStarted","Data":"0616d9205ee867ecc0c91c794b4511e87526b299d1141d71d9c1e9b3149e88c3"}
Apr 17 10:17:40.927830 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:40.927745 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq" event={"ID":"7ec60248-7d9c-4930-984a-1104f6a4e7a0","Type":"ContainerStarted","Data":"da3da4ce964780cb542b3601621ee4fc03740375e5a5dd02131e683a8873cb71"}
Apr 17 10:17:40.929447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:40.929411 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" event={"ID":"9e6ec344-3806-4357-8d4e-229cc8d0f613","Type":"ContainerStarted","Data":"b1266f5fe36f3ed1935e1ba366e2f1a92f7d19fb80f08ad8db3c7b798a5643df"}
Apr 17 10:17:40.929658 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:40.929456 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" event={"ID":"9e6ec344-3806-4357-8d4e-229cc8d0f613","Type":"ContainerStarted","Data":"84ed3c264179cb5669adf04a0542fff79f6144b94cb052249ddbf51de5153f7d"}
Apr 17 10:17:40.933499 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:40.933477 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6lwvd" event={"ID":"730f5024-a732-4fc5-b09b-dab4adee9607","Type":"ContainerStarted","Data":"946b9d8fe03cb125366db71f9ce9860cfdafda47f0a1748b67e6bb4f1bfbda74"}
Apr 17 10:17:40.933595 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:40.933512 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6lwvd" event={"ID":"730f5024-a732-4fc5-b09b-dab4adee9607","Type":"ContainerStarted","Data":"c5ba69bb88bcced06bb2bf9b9a291bdacf08a0979f1f56f449b823fcae08b06d"}
Apr 17 10:17:40.934592 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:40.934570 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jwf2j" event={"ID":"c1879161-9d08-4f31-af72-d7e9b7b6b0c7","Type":"ContainerStarted","Data":"441704c6ffb0f092f6384a9cb6db82f142c6911080c482191cb4e252ab966856"}
Apr 17 10:17:40.935541 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:40.935521 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qlxxc" event={"ID":"e8471c84-04de-44cd-a299-b4dc3d7003ab","Type":"ContainerStarted","Data":"69a0335db01bba3c1ed149a3007caaec77e2858876f0e3244cb3b6c0f7176156"}
Apr 17 10:17:40.936860 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:40.936841 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal" event={"ID":"e9b8fe15074523188d471c5f98f7d062","Type":"ContainerStarted","Data":"32fb9e0c5c19b7077b62de24952a34207c73f21f72b45950bdb9111d7473ef62"}
Apr 17 10:17:40.939766 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:40.938446 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8pm8f" event={"ID":"9ff2c5cc-5a1b-4207-b5da-e7cd37db94b0","Type":"ContainerStarted","Data":"87ea13ac02d83bdd18b785c9f514abe091e914d8e094c90803abaa1f6b3bd5a4"}
Apr 17 10:17:40.941807 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:40.941777 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v8cbj" event={"ID":"c8515f25-7800-40c5-a053-06805e22f902","Type":"ContainerStarted","Data":"1f3a36b5befad9f2553e03143123da446f756e3f2de59b72761e4ac50dcf0950"}
Apr 17 10:17:40.963206 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:40.963163 2579 pod_startup_latency_tracker.go:104] "Observed
pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-hdfrm" podStartSLOduration=3.229946316 podStartE2EDuration="19.963148665s" podCreationTimestamp="2026-04-17 10:17:21 +0000 UTC" firstStartedPulling="2026-04-17 10:17:23.680999133 +0000 UTC m=+3.417185467" lastFinishedPulling="2026-04-17 10:17:40.414201481 +0000 UTC m=+20.150387816" observedRunningTime="2026-04-17 10:17:40.951247152 +0000 UTC m=+20.687433499" watchObservedRunningTime="2026-04-17 10:17:40.963148665 +0000 UTC m=+20.699335018" Apr 17 10:17:40.963362 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:40.963344 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6lwvd" podStartSLOduration=3.96333929 podStartE2EDuration="3.96333929s" podCreationTimestamp="2026-04-17 10:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 10:17:40.962660618 +0000 UTC m=+20.698846973" watchObservedRunningTime="2026-04-17 10:17:40.96333929 +0000 UTC m=+20.699525642" Apr 17 10:17:40.974526 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:40.974474 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-180.ec2.internal" podStartSLOduration=18.97445652 podStartE2EDuration="18.97445652s" podCreationTimestamp="2026-04-17 10:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 10:17:40.974310477 +0000 UTC m=+20.710496838" watchObservedRunningTime="2026-04-17 10:17:40.97445652 +0000 UTC m=+20.710642873" Apr 17 10:17:40.984667 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:40.984627 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-8pm8f" podStartSLOduration=4.580245036 podStartE2EDuration="20.984613208s" 
podCreationTimestamp="2026-04-17 10:17:20 +0000 UTC" firstStartedPulling="2026-04-17 10:17:23.690824648 +0000 UTC m=+3.427010978" lastFinishedPulling="2026-04-17 10:17:40.095192816 +0000 UTC m=+19.831379150" observedRunningTime="2026-04-17 10:17:40.984531806 +0000 UTC m=+20.720718159" watchObservedRunningTime="2026-04-17 10:17:40.984613208 +0000 UTC m=+20.720799544" Apr 17 10:17:40.996676 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:40.996618 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qlxxc" podStartSLOduration=3.263542478 podStartE2EDuration="19.996598732s" podCreationTimestamp="2026-04-17 10:17:21 +0000 UTC" firstStartedPulling="2026-04-17 10:17:23.681139079 +0000 UTC m=+3.417325417" lastFinishedPulling="2026-04-17 10:17:40.414195332 +0000 UTC m=+20.150381671" observedRunningTime="2026-04-17 10:17:40.995839798 +0000 UTC m=+20.732026151" watchObservedRunningTime="2026-04-17 10:17:40.996598732 +0000 UTC m=+20.732785087" Apr 17 10:17:41.864373 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:41.864343 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6q4fn" Apr 17 10:17:41.864537 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:41.864479 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6q4fn" podUID="964ce349-102e-4852-b1d8-a4c58565a4a0" Apr 17 10:17:41.945369 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:41.945346 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/0.log" Apr 17 10:17:41.945752 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:41.945608 2579 generic.go:358] "Generic (PLEG): container finished" podID="9e6ec344-3806-4357-8d4e-229cc8d0f613" containerID="b1266f5fe36f3ed1935e1ba366e2f1a92f7d19fb80f08ad8db3c7b798a5643df" exitCode=1 Apr 17 10:17:41.945752 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:41.945642 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" event={"ID":"9e6ec344-3806-4357-8d4e-229cc8d0f613","Type":"ContainerStarted","Data":"b17cc2416e61208a5e6695f57a5dd371c79c11c9a082d79ec430877968a1cbc3"} Apr 17 10:17:41.945752 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:41.945672 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" event={"ID":"9e6ec344-3806-4357-8d4e-229cc8d0f613","Type":"ContainerDied","Data":"b1266f5fe36f3ed1935e1ba366e2f1a92f7d19fb80f08ad8db3c7b798a5643df"} Apr 17 10:17:41.945752 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:41.945683 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" event={"ID":"9e6ec344-3806-4357-8d4e-229cc8d0f613","Type":"ContainerStarted","Data":"3700d7fd475cd10b22962eca515c892bcf58efa23bf20cff2b90d0029cf3f434"} Apr 17 10:17:41.945752 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:41.945691 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" event={"ID":"9e6ec344-3806-4357-8d4e-229cc8d0f613","Type":"ContainerStarted","Data":"0259deca2d3b8e819f30d5e35f1e2952943dc06bef44206ad2099d96c92015d5"} Apr 17 10:17:41.945752 
ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:41.945698 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" event={"ID":"9e6ec344-3806-4357-8d4e-229cc8d0f613","Type":"ContainerStarted","Data":"f652bb53943ee73755ea3714ca5e1a6435e54e5bc4e8d8f265b88abcd81c741a"} Apr 17 10:17:41.946899 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:41.946880 2579 generic.go:358] "Generic (PLEG): container finished" podID="c1879161-9d08-4f31-af72-d7e9b7b6b0c7" containerID="441704c6ffb0f092f6384a9cb6db82f142c6911080c482191cb4e252ab966856" exitCode=0 Apr 17 10:17:41.947009 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:41.946985 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jwf2j" event={"ID":"c1879161-9d08-4f31-af72-d7e9b7b6b0c7","Type":"ContainerDied","Data":"441704c6ffb0f092f6384a9cb6db82f142c6911080c482191cb4e252ab966856"} Apr 17 10:17:41.963217 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:41.962944 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-v8cbj" podStartSLOduration=4.198139889 podStartE2EDuration="20.962927265s" podCreationTimestamp="2026-04-17 10:17:21 +0000 UTC" firstStartedPulling="2026-04-17 10:17:23.687456169 +0000 UTC m=+3.423642499" lastFinishedPulling="2026-04-17 10:17:40.45224354 +0000 UTC m=+20.188429875" observedRunningTime="2026-04-17 10:17:41.027397415 +0000 UTC m=+20.763583768" watchObservedRunningTime="2026-04-17 10:17:41.962927265 +0000 UTC m=+21.699113619" Apr 17 10:17:42.278785 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:42.278753 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 10:17:42.835566 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:42.835436 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T10:17:42.278775198Z","UUID":"a83092f7-e5c7-4df5-9253-a23f457db8da","Handler":null,"Name":"","Endpoint":""} Apr 17 10:17:42.838562 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:42.838516 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 10:17:42.838562 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:42.838552 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 10:17:42.864607 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:42.864573 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzjpm" Apr 17 10:17:42.864788 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:42.864710 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xzjpm" podUID="775480d3-f4fc-4837-9404-54348aa3f205" Apr 17 10:17:42.951016 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:42.950975 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-pg58d" event={"ID":"f8ebcb33-32f1-469e-a5f3-c80aa250dc88","Type":"ContainerStarted","Data":"f7df4035ae591e02c85980cbda08f48bb5109075910dd2c0c4c2f16fb17a8df8"} Apr 17 10:17:42.953232 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:42.953200 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq" event={"ID":"7ec60248-7d9c-4930-984a-1104f6a4e7a0","Type":"ContainerStarted","Data":"077b085bf5240e5cc842bb6b366e1485908dee8f1cb894adee83ea847463167d"} Apr 17 10:17:42.975365 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:42.975310 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-pg58d" podStartSLOduration=5.25087296 podStartE2EDuration="21.975295686s" podCreationTimestamp="2026-04-17 10:17:21 +0000 UTC" firstStartedPulling="2026-04-17 10:17:23.68977413 +0000 UTC m=+3.425960475" lastFinishedPulling="2026-04-17 10:17:40.414196863 +0000 UTC m=+20.150383201" observedRunningTime="2026-04-17 10:17:42.974916183 +0000 UTC m=+22.711102536" watchObservedRunningTime="2026-04-17 10:17:42.975295686 +0000 UTC m=+22.711482039" Apr 17 10:17:43.864051 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:43.864017 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6q4fn" Apr 17 10:17:43.864196 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:43.864171 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6q4fn" podUID="964ce349-102e-4852-b1d8-a4c58565a4a0" Apr 17 10:17:43.957534 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:43.957440 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq" event={"ID":"7ec60248-7d9c-4930-984a-1104f6a4e7a0","Type":"ContainerStarted","Data":"f136a3af511f3b411c52c3b4a86de9a210c9edc744e6b37158c3e84c5a650819"} Apr 17 10:17:43.961059 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:43.961027 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/0.log" Apr 17 10:17:43.961448 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:43.961404 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" event={"ID":"9e6ec344-3806-4357-8d4e-229cc8d0f613","Type":"ContainerStarted","Data":"b4b18f360316deb6132d60fe22a68e1113788d5ca027f965b2959460c0257866"} Apr 17 10:17:43.973200 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:43.973152 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9v4bq" podStartSLOduration=2.97097821 podStartE2EDuration="22.97313408s" podCreationTimestamp="2026-04-17 10:17:21 +0000 UTC" firstStartedPulling="2026-04-17 10:17:23.687281778 +0000 UTC m=+3.423468110" lastFinishedPulling="2026-04-17 10:17:43.689437636 +0000 UTC m=+23.425623980" 
observedRunningTime="2026-04-17 10:17:43.973082505 +0000 UTC m=+23.709268858" watchObservedRunningTime="2026-04-17 10:17:43.97313408 +0000 UTC m=+23.709320415" Apr 17 10:17:44.864057 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:44.864017 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzjpm" Apr 17 10:17:44.864223 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:44.864159 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzjpm" podUID="775480d3-f4fc-4837-9404-54348aa3f205" Apr 17 10:17:45.562579 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:45.562541 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-8pm8f" Apr 17 10:17:45.563291 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:45.563269 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-8pm8f" Apr 17 10:17:45.864297 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:45.864258 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6q4fn" Apr 17 10:17:45.864486 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:45.864385 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6q4fn" podUID="964ce349-102e-4852-b1d8-a4c58565a4a0" Apr 17 10:17:45.965134 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:45.965102 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-8pm8f" Apr 17 10:17:45.965695 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:45.965674 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-8pm8f" Apr 17 10:17:46.864596 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:46.864385 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzjpm" Apr 17 10:17:46.865004 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:46.864682 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xzjpm" podUID="775480d3-f4fc-4837-9404-54348aa3f205" Apr 17 10:17:46.970109 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:46.970078 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/0.log" Apr 17 10:17:46.970475 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:46.970442 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" event={"ID":"9e6ec344-3806-4357-8d4e-229cc8d0f613","Type":"ContainerStarted","Data":"b854b011506d650e2cf4156043a62efce9fe32d6cee90689a341cebcb9f4d5f0"} Apr 17 10:17:46.970815 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:46.970794 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:46.970991 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:46.970973 2579 scope.go:117] "RemoveContainer" containerID="b1266f5fe36f3ed1935e1ba366e2f1a92f7d19fb80f08ad8db3c7b798a5643df" Apr 17 10:17:46.972358 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:46.972335 2579 generic.go:358] "Generic (PLEG): container finished" podID="c1879161-9d08-4f31-af72-d7e9b7b6b0c7" containerID="3f06b7910a438dbfcfcafa4f258b0027b84a129d8fb379d2bd180403db5481c0" exitCode=0 Apr 17 10:17:46.972476 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:46.972439 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jwf2j" event={"ID":"c1879161-9d08-4f31-af72-d7e9b7b6b0c7","Type":"ContainerDied","Data":"3f06b7910a438dbfcfcafa4f258b0027b84a129d8fb379d2bd180403db5481c0"} Apr 17 10:17:46.987472 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:46.987447 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:47.863957 ip-10-0-128-180 kubenswrapper[2579]: I0417 
10:17:47.863928 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6q4fn" Apr 17 10:17:47.864121 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:47.864056 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6q4fn" podUID="964ce349-102e-4852-b1d8-a4c58565a4a0" Apr 17 10:17:47.980118 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:47.980094 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/0.log" Apr 17 10:17:47.980529 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:47.980457 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" event={"ID":"9e6ec344-3806-4357-8d4e-229cc8d0f613","Type":"ContainerStarted","Data":"fa75ad72c87582e15950fcbbabf162c41c83b10fc44a074b6bdb2a351fb5bedb"} Apr 17 10:17:47.980895 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:47.980872 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:47.980895 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:47.980897 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:47.995078 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:47.995051 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" Apr 17 10:17:48.005842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:48.005795 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6" podStartSLOduration=10.182324707 podStartE2EDuration="27.005780378s" podCreationTimestamp="2026-04-17 10:17:21 +0000 UTC" firstStartedPulling="2026-04-17 10:17:23.681000642 +0000 UTC m=+3.417186973" lastFinishedPulling="2026-04-17 10:17:40.504456299 +0000 UTC m=+20.240642644" observedRunningTime="2026-04-17 10:17:48.004665283 +0000 UTC m=+27.740851637" watchObservedRunningTime="2026-04-17 10:17:48.005780378 +0000 UTC m=+27.741966731" Apr 17 10:17:48.353566 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:48.353533 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6q4fn"] Apr 17 10:17:48.353745 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:48.353681 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6q4fn" Apr 17 10:17:48.353824 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:48.353800 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6q4fn" podUID="964ce349-102e-4852-b1d8-a4c58565a4a0" Apr 17 10:17:48.356356 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:48.356327 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xzjpm"] Apr 17 10:17:48.356501 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:48.356452 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzjpm" Apr 17 10:17:48.356551 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:48.356533 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzjpm" podUID="775480d3-f4fc-4837-9404-54348aa3f205" Apr 17 10:17:48.983747 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:48.983708 2579 generic.go:358] "Generic (PLEG): container finished" podID="c1879161-9d08-4f31-af72-d7e9b7b6b0c7" containerID="19c292c964d0452278f4e67cf77694983307f08f37e8942d21235650ffa83f57" exitCode=0 Apr 17 10:17:48.984281 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:48.983789 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jwf2j" event={"ID":"c1879161-9d08-4f31-af72-d7e9b7b6b0c7","Type":"ContainerDied","Data":"19c292c964d0452278f4e67cf77694983307f08f37e8942d21235650ffa83f57"} Apr 17 10:17:49.864654 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:49.864580 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6q4fn" Apr 17 10:17:49.864800 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:49.864580 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzjpm" Apr 17 10:17:49.864800 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:49.864715 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6q4fn" podUID="964ce349-102e-4852-b1d8-a4c58565a4a0" Apr 17 10:17:49.864800 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:49.864782 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xzjpm" podUID="775480d3-f4fc-4837-9404-54348aa3f205" Apr 17 10:17:50.675277 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:50.673788 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-g42w5"] Apr 17 10:17:50.689572 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:50.689525 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-g42w5"] Apr 17 10:17:50.689721 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:50.689685 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-g42w5" Apr 17 10:17:50.689782 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:50.689761 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-g42w5" podUID="9e8f4822-a90d-4e05-81a2-704623b16ed2" Apr 17 10:17:50.808296 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:50.808265 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9e8f4822-a90d-4e05-81a2-704623b16ed2-original-pull-secret\") pod \"global-pull-secret-syncer-g42w5\" (UID: \"9e8f4822-a90d-4e05-81a2-704623b16ed2\") " pod="kube-system/global-pull-secret-syncer-g42w5" Apr 17 10:17:50.808476 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:50.808320 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9e8f4822-a90d-4e05-81a2-704623b16ed2-dbus\") pod \"global-pull-secret-syncer-g42w5\" (UID: \"9e8f4822-a90d-4e05-81a2-704623b16ed2\") " pod="kube-system/global-pull-secret-syncer-g42w5" Apr 17 10:17:50.808476 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:50.808350 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9e8f4822-a90d-4e05-81a2-704623b16ed2-kubelet-config\") pod \"global-pull-secret-syncer-g42w5\" (UID: \"9e8f4822-a90d-4e05-81a2-704623b16ed2\") " pod="kube-system/global-pull-secret-syncer-g42w5" Apr 17 10:17:50.909646 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:50.909609 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9e8f4822-a90d-4e05-81a2-704623b16ed2-original-pull-secret\") pod \"global-pull-secret-syncer-g42w5\" (UID: \"9e8f4822-a90d-4e05-81a2-704623b16ed2\") " pod="kube-system/global-pull-secret-syncer-g42w5" Apr 17 10:17:50.909831 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:50.909663 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/9e8f4822-a90d-4e05-81a2-704623b16ed2-dbus\") pod \"global-pull-secret-syncer-g42w5\" (UID: \"9e8f4822-a90d-4e05-81a2-704623b16ed2\") " pod="kube-system/global-pull-secret-syncer-g42w5" Apr 17 10:17:50.909831 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:50.909681 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9e8f4822-a90d-4e05-81a2-704623b16ed2-kubelet-config\") pod \"global-pull-secret-syncer-g42w5\" (UID: \"9e8f4822-a90d-4e05-81a2-704623b16ed2\") " pod="kube-system/global-pull-secret-syncer-g42w5" Apr 17 10:17:50.909831 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:50.909743 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9e8f4822-a90d-4e05-81a2-704623b16ed2-kubelet-config\") pod \"global-pull-secret-syncer-g42w5\" (UID: \"9e8f4822-a90d-4e05-81a2-704623b16ed2\") " pod="kube-system/global-pull-secret-syncer-g42w5" Apr 17 10:17:50.909831 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:50.909753 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 10:17:50.909831 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:50.909812 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e8f4822-a90d-4e05-81a2-704623b16ed2-original-pull-secret podName:9e8f4822-a90d-4e05-81a2-704623b16ed2 nodeName:}" failed. No retries permitted until 2026-04-17 10:17:51.409797501 +0000 UTC m=+31.145983833 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9e8f4822-a90d-4e05-81a2-704623b16ed2-original-pull-secret") pod "global-pull-secret-syncer-g42w5" (UID: "9e8f4822-a90d-4e05-81a2-704623b16ed2") : object "kube-system"/"original-pull-secret" not registered Apr 17 10:17:50.910012 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:50.909843 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9e8f4822-a90d-4e05-81a2-704623b16ed2-dbus\") pod \"global-pull-secret-syncer-g42w5\" (UID: \"9e8f4822-a90d-4e05-81a2-704623b16ed2\") " pod="kube-system/global-pull-secret-syncer-g42w5" Apr 17 10:17:50.990474 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:50.990388 2579 generic.go:358] "Generic (PLEG): container finished" podID="c1879161-9d08-4f31-af72-d7e9b7b6b0c7" containerID="31dedea244f5e59afca624491303e45cd0c19407460059c8bd2257eca2211f8f" exitCode=0 Apr 17 10:17:50.990474 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:50.990453 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jwf2j" event={"ID":"c1879161-9d08-4f31-af72-d7e9b7b6b0c7","Type":"ContainerDied","Data":"31dedea244f5e59afca624491303e45cd0c19407460059c8bd2257eca2211f8f"} Apr 17 10:17:51.413202 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:51.413167 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9e8f4822-a90d-4e05-81a2-704623b16ed2-original-pull-secret\") pod \"global-pull-secret-syncer-g42w5\" (UID: \"9e8f4822-a90d-4e05-81a2-704623b16ed2\") " pod="kube-system/global-pull-secret-syncer-g42w5" Apr 17 10:17:51.413400 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:51.413266 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 10:17:51.413400 ip-10-0-128-180 
kubenswrapper[2579]: E0417 10:17:51.413333 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e8f4822-a90d-4e05-81a2-704623b16ed2-original-pull-secret podName:9e8f4822-a90d-4e05-81a2-704623b16ed2 nodeName:}" failed. No retries permitted until 2026-04-17 10:17:52.413316188 +0000 UTC m=+32.149502523 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9e8f4822-a90d-4e05-81a2-704623b16ed2-original-pull-secret") pod "global-pull-secret-syncer-g42w5" (UID: "9e8f4822-a90d-4e05-81a2-704623b16ed2") : object "kube-system"/"original-pull-secret" not registered Apr 17 10:17:51.864686 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:51.864649 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6q4fn" Apr 17 10:17:51.865186 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:51.864649 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzjpm" Apr 17 10:17:51.865186 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:51.864767 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6q4fn" podUID="964ce349-102e-4852-b1d8-a4c58565a4a0" Apr 17 10:17:51.865186 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:51.864838 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xzjpm" podUID="775480d3-f4fc-4837-9404-54348aa3f205" Apr 17 10:17:52.419181 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.419138 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9e8f4822-a90d-4e05-81a2-704623b16ed2-original-pull-secret\") pod \"global-pull-secret-syncer-g42w5\" (UID: \"9e8f4822-a90d-4e05-81a2-704623b16ed2\") " pod="kube-system/global-pull-secret-syncer-g42w5" Apr 17 10:17:52.419364 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:52.419293 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 10:17:52.419364 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:52.419359 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e8f4822-a90d-4e05-81a2-704623b16ed2-original-pull-secret podName:9e8f4822-a90d-4e05-81a2-704623b16ed2 nodeName:}" failed. No retries permitted until 2026-04-17 10:17:54.419344254 +0000 UTC m=+34.155530589 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9e8f4822-a90d-4e05-81a2-704623b16ed2-original-pull-secret") pod "global-pull-secret-syncer-g42w5" (UID: "9e8f4822-a90d-4e05-81a2-704623b16ed2") : object "kube-system"/"original-pull-secret" not registered Apr 17 10:17:52.529762 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.529733 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-180.ec2.internal" event="NodeReady" Apr 17 10:17:52.529914 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.529883 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 10:17:52.564656 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.564613 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fhlm5"] Apr 17 10:17:52.588019 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.587936 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-566b6b7754-2cpbf"] Apr 17 10:17:52.588153 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.588078 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fhlm5" Apr 17 10:17:52.590892 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.590862 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-lkm7m\"" Apr 17 10:17:52.590892 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.590897 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 10:17:52.591099 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.590922 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 10:17:52.591099 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.590876 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 10:17:52.591252 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.591224 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 10:17:52.617604 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.617570 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fhlm5"] Apr 17 10:17:52.617604 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.617599 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-566b6b7754-2cpbf"] Apr 17 10:17:52.617604 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.617610 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-nn582"] Apr 17 10:17:52.617866 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.617734 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.620985 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.620964 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 10:17:52.621156 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.621139 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 10:17:52.621470 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.621457 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 10:17:52.621702 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.621687 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-mw7t8\"" Apr 17 10:17:52.633973 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.633950 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 10:17:52.644372 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.644343 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gbbkk"] Apr 17 10:17:52.644576 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.644560 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-nn582" Apr 17 10:17:52.655289 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.655268 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-tplzz\"" Apr 17 10:17:52.655628 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.655613 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 17 10:17:52.655812 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.655794 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 17 10:17:52.672636 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.672611 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-27dzv"] Apr 17 10:17:52.672761 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.672746 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gbbkk" Apr 17 10:17:52.675033 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.675013 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 10:17:52.675247 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.675227 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 10:17:52.675350 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.675249 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 10:17:52.675350 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.675231 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xs5df\"" Apr 17 10:17:52.687237 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.687204 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vf6k6"] Apr 17 10:17:52.687437 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.687401 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-27dzv" Apr 17 10:17:52.690395 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.690377 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 10:17:52.690496 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.690379 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-kq7pt\"" Apr 17 10:17:52.690917 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.690902 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 10:17:52.707043 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.707015 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gbbkk"] Apr 17 10:17:52.707043 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.707039 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-nn582"] Apr 17 10:17:52.707216 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.707053 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vf6k6"] Apr 17 10:17:52.707216 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.707064 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-27dzv"] Apr 17 10:17:52.707216 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.707151 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vf6k6" Apr 17 10:17:52.709923 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.709896 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 10:17:52.710040 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.709943 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-jrb2z\"" Apr 17 10:17:52.710040 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.709948 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 10:17:52.721110 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.721090 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fff2f824-dc09-4d90-817f-04bff462678a-trusted-ca\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.721204 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.721117 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gll59\" (UniqueName: \"kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-kube-api-access-gll59\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.721204 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.721152 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fff2f824-dc09-4d90-817f-04bff462678a-installation-pull-secrets\") pod 
\"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.721204 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.721178 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ccb66962-e736-41cf-aff2-701b47e39579-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-nn582\" (UID: \"ccb66962-e736-41cf-aff2-701b47e39579\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nn582" Apr 17 10:17:52.721298 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.721204 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0837740f-2592-494a-bec7-ad834d8b557e-config\") pod \"service-ca-operator-d6fc45fc5-fhlm5\" (UID: \"0837740f-2592-494a-bec7-ad834d8b557e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fhlm5" Apr 17 10:17:52.721298 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.721228 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-registry-tls\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.721298 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.721247 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlx5t\" (UniqueName: \"kubernetes.io/projected/0837740f-2592-494a-bec7-ad834d8b557e-kube-api-access-dlx5t\") pod \"service-ca-operator-d6fc45fc5-fhlm5\" (UID: \"0837740f-2592-494a-bec7-ad834d8b557e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fhlm5" Apr 17 10:17:52.721298 
ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.721267 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fff2f824-dc09-4d90-817f-04bff462678a-ca-trust-extracted\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.721298 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.721282 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fff2f824-dc09-4d90-817f-04bff462678a-registry-certificates\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.721464 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.721314 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ccb66962-e736-41cf-aff2-701b47e39579-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nn582\" (UID: \"ccb66962-e736-41cf-aff2-701b47e39579\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nn582" Apr 17 10:17:52.721464 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.721340 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fff2f824-dc09-4d90-817f-04bff462678a-image-registry-private-configuration\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.721464 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.721355 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0837740f-2592-494a-bec7-ad834d8b557e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-fhlm5\" (UID: \"0837740f-2592-494a-bec7-ad834d8b557e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fhlm5" Apr 17 10:17:52.721464 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.721376 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-bound-sa-token\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.822103 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.822060 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fff2f824-dc09-4d90-817f-04bff462678a-installation-pull-secrets\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.822103 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.822097 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ccb66962-e736-41cf-aff2-701b47e39579-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-nn582\" (UID: \"ccb66962-e736-41cf-aff2-701b47e39579\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nn582" Apr 17 10:17:52.822350 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.822116 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0837740f-2592-494a-bec7-ad834d8b557e-config\") pod \"service-ca-operator-d6fc45fc5-fhlm5\" (UID: 
\"0837740f-2592-494a-bec7-ad834d8b557e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fhlm5" Apr 17 10:17:52.822350 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.822236 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-registry-tls\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.822350 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.822264 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dlx5t\" (UniqueName: \"kubernetes.io/projected/0837740f-2592-494a-bec7-ad834d8b557e-kube-api-access-dlx5t\") pod \"service-ca-operator-d6fc45fc5-fhlm5\" (UID: \"0837740f-2592-494a-bec7-ad834d8b557e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fhlm5" Apr 17 10:17:52.822350 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.822286 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50cdff83-6754-4d62-8a0c-bfe31b13a7eb-cert\") pod \"ingress-canary-gbbkk\" (UID: \"50cdff83-6754-4d62-8a0c-bfe31b13a7eb\") " pod="openshift-ingress-canary/ingress-canary-gbbkk" Apr 17 10:17:52.822350 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.822319 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fff2f824-dc09-4d90-817f-04bff462678a-ca-trust-extracted\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.822632 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:52.822364 2579 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 10:17:52.822632 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:52.822379 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-566b6b7754-2cpbf: secret "image-registry-tls" not found Apr 17 10:17:52.822632 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.822439 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fff2f824-dc09-4d90-817f-04bff462678a-registry-certificates\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.822632 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.822471 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdt77\" (UniqueName: \"kubernetes.io/projected/7715462b-006b-40d6-ba3b-5b84dfb6caeb-kube-api-access-hdt77\") pod \"network-check-source-8894fc9bd-vf6k6\" (UID: \"7715462b-006b-40d6-ba3b-5b84dfb6caeb\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vf6k6" Apr 17 10:17:52.822632 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:52.822482 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-registry-tls podName:fff2f824-dc09-4d90-817f-04bff462678a nodeName:}" failed. No retries permitted until 2026-04-17 10:17:53.322462611 +0000 UTC m=+33.058648947 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-registry-tls") pod "image-registry-566b6b7754-2cpbf" (UID: "fff2f824-dc09-4d90-817f-04bff462678a") : secret "image-registry-tls" not found Apr 17 10:17:52.822632 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.822510 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-config-volume\") pod \"dns-default-27dzv\" (UID: \"156b0fef-9a3a-4de8-81ea-13ac2cb45c19\") " pod="openshift-dns/dns-default-27dzv" Apr 17 10:17:52.822632 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.822534 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-tmp-dir\") pod \"dns-default-27dzv\" (UID: \"156b0fef-9a3a-4de8-81ea-13ac2cb45c19\") " pod="openshift-dns/dns-default-27dzv" Apr 17 10:17:52.822632 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.822613 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-metrics-tls\") pod \"dns-default-27dzv\" (UID: \"156b0fef-9a3a-4de8-81ea-13ac2cb45c19\") " pod="openshift-dns/dns-default-27dzv" Apr 17 10:17:52.823026 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.822648 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb9ps\" (UniqueName: \"kubernetes.io/projected/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-kube-api-access-tb9ps\") pod \"dns-default-27dzv\" (UID: \"156b0fef-9a3a-4de8-81ea-13ac2cb45c19\") " pod="openshift-dns/dns-default-27dzv" Apr 17 10:17:52.823026 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.822677 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ccb66962-e736-41cf-aff2-701b47e39579-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nn582\" (UID: \"ccb66962-e736-41cf-aff2-701b47e39579\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nn582" Apr 17 10:17:52.823026 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.822708 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85cqq\" (UniqueName: \"kubernetes.io/projected/50cdff83-6754-4d62-8a0c-bfe31b13a7eb-kube-api-access-85cqq\") pod \"ingress-canary-gbbkk\" (UID: \"50cdff83-6754-4d62-8a0c-bfe31b13a7eb\") " pod="openshift-ingress-canary/ingress-canary-gbbkk" Apr 17 10:17:52.823026 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:52.822794 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 10:17:52.823026 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.822808 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0837740f-2592-494a-bec7-ad834d8b557e-config\") pod \"service-ca-operator-d6fc45fc5-fhlm5\" (UID: \"0837740f-2592-494a-bec7-ad834d8b557e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fhlm5" Apr 17 10:17:52.823026 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:52.822841 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccb66962-e736-41cf-aff2-701b47e39579-networking-console-plugin-cert podName:ccb66962-e736-41cf-aff2-701b47e39579 nodeName:}" failed. No retries permitted until 2026-04-17 10:17:53.322824809 +0000 UTC m=+33.059011165 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ccb66962-e736-41cf-aff2-701b47e39579-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-nn582" (UID: "ccb66962-e736-41cf-aff2-701b47e39579") : secret "networking-console-plugin-cert" not found Apr 17 10:17:52.823026 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.822950 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fff2f824-dc09-4d90-817f-04bff462678a-image-registry-private-configuration\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.823026 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.822987 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0837740f-2592-494a-bec7-ad834d8b557e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-fhlm5\" (UID: \"0837740f-2592-494a-bec7-ad834d8b557e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fhlm5" Apr 17 10:17:52.823026 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.823022 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-bound-sa-token\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.823447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.823029 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fff2f824-dc09-4d90-817f-04bff462678a-ca-trust-extracted\") pod \"image-registry-566b6b7754-2cpbf\" (UID: 
\"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.823447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.823061 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fff2f824-dc09-4d90-817f-04bff462678a-trusted-ca\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.823447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.823090 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gll59\" (UniqueName: \"kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-kube-api-access-gll59\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.823447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.823092 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fff2f824-dc09-4d90-817f-04bff462678a-registry-certificates\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.823447 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.823283 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ccb66962-e736-41cf-aff2-701b47e39579-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-nn582\" (UID: \"ccb66962-e736-41cf-aff2-701b47e39579\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nn582" Apr 17 10:17:52.823931 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.823909 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fff2f824-dc09-4d90-817f-04bff462678a-trusted-ca\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.826379 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.826354 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0837740f-2592-494a-bec7-ad834d8b557e-serving-cert\") pod \"service-ca-operator-d6fc45fc5-fhlm5\" (UID: \"0837740f-2592-494a-bec7-ad834d8b557e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fhlm5" Apr 17 10:17:52.826509 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.826488 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fff2f824-dc09-4d90-817f-04bff462678a-image-registry-private-configuration\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.826564 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.826526 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fff2f824-dc09-4d90-817f-04bff462678a-installation-pull-secrets\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.833508 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.833478 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-bound-sa-token\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " 
pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.833628 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.833545 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlx5t\" (UniqueName: \"kubernetes.io/projected/0837740f-2592-494a-bec7-ad834d8b557e-kube-api-access-dlx5t\") pod \"service-ca-operator-d6fc45fc5-fhlm5\" (UID: \"0837740f-2592-494a-bec7-ad834d8b557e\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fhlm5" Apr 17 10:17:52.833926 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.833909 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gll59\" (UniqueName: \"kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-kube-api-access-gll59\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:52.864313 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.864221 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-g42w5" Apr 17 10:17:52.866832 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.866811 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 10:17:52.900147 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.900112 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fhlm5" Apr 17 10:17:52.924184 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.924151 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50cdff83-6754-4d62-8a0c-bfe31b13a7eb-cert\") pod \"ingress-canary-gbbkk\" (UID: \"50cdff83-6754-4d62-8a0c-bfe31b13a7eb\") " pod="openshift-ingress-canary/ingress-canary-gbbkk" Apr 17 10:17:52.924184 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.924187 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdt77\" (UniqueName: \"kubernetes.io/projected/7715462b-006b-40d6-ba3b-5b84dfb6caeb-kube-api-access-hdt77\") pod \"network-check-source-8894fc9bd-vf6k6\" (UID: \"7715462b-006b-40d6-ba3b-5b84dfb6caeb\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vf6k6" Apr 17 10:17:52.924386 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.924205 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-config-volume\") pod \"dns-default-27dzv\" (UID: \"156b0fef-9a3a-4de8-81ea-13ac2cb45c19\") " pod="openshift-dns/dns-default-27dzv" Apr 17 10:17:52.924386 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:52.924292 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 10:17:52.924386 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.924329 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-tmp-dir\") pod \"dns-default-27dzv\" (UID: \"156b0fef-9a3a-4de8-81ea-13ac2cb45c19\") " pod="openshift-dns/dns-default-27dzv" Apr 17 10:17:52.924386 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:52.924364 2579 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50cdff83-6754-4d62-8a0c-bfe31b13a7eb-cert podName:50cdff83-6754-4d62-8a0c-bfe31b13a7eb nodeName:}" failed. No retries permitted until 2026-04-17 10:17:53.424346519 +0000 UTC m=+33.160532853 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/50cdff83-6754-4d62-8a0c-bfe31b13a7eb-cert") pod "ingress-canary-gbbkk" (UID: "50cdff83-6754-4d62-8a0c-bfe31b13a7eb") : secret "canary-serving-cert" not found Apr 17 10:17:52.924587 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.924429 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-metrics-tls\") pod \"dns-default-27dzv\" (UID: \"156b0fef-9a3a-4de8-81ea-13ac2cb45c19\") " pod="openshift-dns/dns-default-27dzv" Apr 17 10:17:52.924587 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.924462 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tb9ps\" (UniqueName: \"kubernetes.io/projected/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-kube-api-access-tb9ps\") pod \"dns-default-27dzv\" (UID: \"156b0fef-9a3a-4de8-81ea-13ac2cb45c19\") " pod="openshift-dns/dns-default-27dzv" Apr 17 10:17:52.924587 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.924500 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85cqq\" (UniqueName: \"kubernetes.io/projected/50cdff83-6754-4d62-8a0c-bfe31b13a7eb-kube-api-access-85cqq\") pod \"ingress-canary-gbbkk\" (UID: \"50cdff83-6754-4d62-8a0c-bfe31b13a7eb\") " pod="openshift-ingress-canary/ingress-canary-gbbkk" Apr 17 10:17:52.924587 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:52.924557 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 10:17:52.924746 ip-10-0-128-180 
kubenswrapper[2579]: E0417 10:17:52.924629 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-metrics-tls podName:156b0fef-9a3a-4de8-81ea-13ac2cb45c19 nodeName:}" failed. No retries permitted until 2026-04-17 10:17:53.424610953 +0000 UTC m=+33.160797293 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-metrics-tls") pod "dns-default-27dzv" (UID: "156b0fef-9a3a-4de8-81ea-13ac2cb45c19") : secret "dns-default-metrics-tls" not found Apr 17 10:17:52.924746 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.924722 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-tmp-dir\") pod \"dns-default-27dzv\" (UID: \"156b0fef-9a3a-4de8-81ea-13ac2cb45c19\") " pod="openshift-dns/dns-default-27dzv" Apr 17 10:17:52.924981 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.924962 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-config-volume\") pod \"dns-default-27dzv\" (UID: \"156b0fef-9a3a-4de8-81ea-13ac2cb45c19\") " pod="openshift-dns/dns-default-27dzv" Apr 17 10:17:52.933024 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.932989 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85cqq\" (UniqueName: \"kubernetes.io/projected/50cdff83-6754-4d62-8a0c-bfe31b13a7eb-kube-api-access-85cqq\") pod \"ingress-canary-gbbkk\" (UID: \"50cdff83-6754-4d62-8a0c-bfe31b13a7eb\") " pod="openshift-ingress-canary/ingress-canary-gbbkk" Apr 17 10:17:52.933167 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.932990 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdt77\" (UniqueName: 
\"kubernetes.io/projected/7715462b-006b-40d6-ba3b-5b84dfb6caeb-kube-api-access-hdt77\") pod \"network-check-source-8894fc9bd-vf6k6\" (UID: \"7715462b-006b-40d6-ba3b-5b84dfb6caeb\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vf6k6" Apr 17 10:17:52.933167 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:52.933068 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb9ps\" (UniqueName: \"kubernetes.io/projected/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-kube-api-access-tb9ps\") pod \"dns-default-27dzv\" (UID: \"156b0fef-9a3a-4de8-81ea-13ac2cb45c19\") " pod="openshift-dns/dns-default-27dzv" Apr 17 10:17:53.015977 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:53.015799 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vf6k6" Apr 17 10:17:53.071099 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:53.070283 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fhlm5"] Apr 17 10:17:53.075654 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:53.075618 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0837740f_2592_494a_bec7_ad834d8b557e.slice/crio-34cfcf8160f989fc285a47639ed86252a0f2f77e6f70acc0406f30f495aa4def WatchSource:0}: Error finding container 34cfcf8160f989fc285a47639ed86252a0f2f77e6f70acc0406f30f495aa4def: Status 404 returned error can't find the container with id 34cfcf8160f989fc285a47639ed86252a0f2f77e6f70acc0406f30f495aa4def Apr 17 10:17:53.141284 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:53.141168 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-vf6k6"] Apr 17 10:17:53.146110 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:53.146080 2579 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7715462b_006b_40d6_ba3b_5b84dfb6caeb.slice/crio-a722676d91b5aaf156e42bbf0fe09ec8a55926c39783c489b54f08248f33f990 WatchSource:0}: Error finding container a722676d91b5aaf156e42bbf0fe09ec8a55926c39783c489b54f08248f33f990: Status 404 returned error can't find the container with id a722676d91b5aaf156e42bbf0fe09ec8a55926c39783c489b54f08248f33f990 Apr 17 10:17:53.328487 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:53.328444 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ccb66962-e736-41cf-aff2-701b47e39579-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nn582\" (UID: \"ccb66962-e736-41cf-aff2-701b47e39579\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nn582" Apr 17 10:17:53.328676 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:53.328551 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-registry-tls\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:53.328676 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:53.328595 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 10:17:53.328676 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:53.328664 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccb66962-e736-41cf-aff2-701b47e39579-networking-console-plugin-cert podName:ccb66962-e736-41cf-aff2-701b47e39579 nodeName:}" failed. No retries permitted until 2026-04-17 10:17:54.328647862 +0000 UTC m=+34.064834194 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ccb66962-e736-41cf-aff2-701b47e39579-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-nn582" (UID: "ccb66962-e736-41cf-aff2-701b47e39579") : secret "networking-console-plugin-cert" not found Apr 17 10:17:53.328817 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:53.328698 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 10:17:53.328817 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:53.328722 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-566b6b7754-2cpbf: secret "image-registry-tls" not found Apr 17 10:17:53.328817 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:53.328774 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-registry-tls podName:fff2f824-dc09-4d90-817f-04bff462678a nodeName:}" failed. No retries permitted until 2026-04-17 10:17:54.32876003 +0000 UTC m=+34.064946361 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-registry-tls") pod "image-registry-566b6b7754-2cpbf" (UID: "fff2f824-dc09-4d90-817f-04bff462678a") : secret "image-registry-tls" not found Apr 17 10:17:53.429775 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:53.429687 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50cdff83-6754-4d62-8a0c-bfe31b13a7eb-cert\") pod \"ingress-canary-gbbkk\" (UID: \"50cdff83-6754-4d62-8a0c-bfe31b13a7eb\") " pod="openshift-ingress-canary/ingress-canary-gbbkk" Apr 17 10:17:53.429775 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:53.429739 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-metrics-tls\") pod \"dns-default-27dzv\" (UID: \"156b0fef-9a3a-4de8-81ea-13ac2cb45c19\") " pod="openshift-dns/dns-default-27dzv" Apr 17 10:17:53.429948 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:53.429843 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 10:17:53.429948 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:53.429911 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50cdff83-6754-4d62-8a0c-bfe31b13a7eb-cert podName:50cdff83-6754-4d62-8a0c-bfe31b13a7eb nodeName:}" failed. No retries permitted until 2026-04-17 10:17:54.42989293 +0000 UTC m=+34.166079292 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/50cdff83-6754-4d62-8a0c-bfe31b13a7eb-cert") pod "ingress-canary-gbbkk" (UID: "50cdff83-6754-4d62-8a0c-bfe31b13a7eb") : secret "canary-serving-cert" not found Apr 17 10:17:53.430012 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:53.429954 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 10:17:53.430012 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:53.429988 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-metrics-tls podName:156b0fef-9a3a-4de8-81ea-13ac2cb45c19 nodeName:}" failed. No retries permitted until 2026-04-17 10:17:54.429978438 +0000 UTC m=+34.166164769 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-metrics-tls") pod "dns-default-27dzv" (UID: "156b0fef-9a3a-4de8-81ea-13ac2cb45c19") : secret "dns-default-metrics-tls" not found Apr 17 10:17:53.866221 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:53.865885 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6q4fn" Apr 17 10:17:53.866395 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:53.865885 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzjpm" Apr 17 10:17:53.868717 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:53.868690 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 10:17:53.869142 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:53.868761 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-lxdvh\"" Apr 17 10:17:53.869404 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:53.869329 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-k4g8r\"" Apr 17 10:17:53.998321 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:53.998280 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vf6k6" event={"ID":"7715462b-006b-40d6-ba3b-5b84dfb6caeb","Type":"ContainerStarted","Data":"a722676d91b5aaf156e42bbf0fe09ec8a55926c39783c489b54f08248f33f990"} Apr 17 10:17:53.999544 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:53.999508 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fhlm5" event={"ID":"0837740f-2592-494a-bec7-ad834d8b557e","Type":"ContainerStarted","Data":"34cfcf8160f989fc285a47639ed86252a0f2f77e6f70acc0406f30f495aa4def"} Apr 17 10:17:54.339379 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:54.339336 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-registry-tls\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:54.339604 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:54.339454 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ccb66962-e736-41cf-aff2-701b47e39579-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nn582\" (UID: \"ccb66962-e736-41cf-aff2-701b47e39579\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nn582" Apr 17 10:17:54.339604 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:54.339589 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 10:17:54.339604 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:54.339594 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 10:17:54.339778 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:54.339616 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-566b6b7754-2cpbf: secret "image-registry-tls" not found Apr 17 10:17:54.339778 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:54.339666 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccb66962-e736-41cf-aff2-701b47e39579-networking-console-plugin-cert podName:ccb66962-e736-41cf-aff2-701b47e39579 nodeName:}" failed. No retries permitted until 2026-04-17 10:17:56.339645913 +0000 UTC m=+36.075832250 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ccb66962-e736-41cf-aff2-701b47e39579-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-nn582" (UID: "ccb66962-e736-41cf-aff2-701b47e39579") : secret "networking-console-plugin-cert" not found Apr 17 10:17:54.339778 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:54.339685 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-registry-tls podName:fff2f824-dc09-4d90-817f-04bff462678a nodeName:}" failed. No retries permitted until 2026-04-17 10:17:56.339675534 +0000 UTC m=+36.075861871 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-registry-tls") pod "image-registry-566b6b7754-2cpbf" (UID: "fff2f824-dc09-4d90-817f-04bff462678a") : secret "image-registry-tls" not found Apr 17 10:17:54.440588 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:54.440538 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9e8f4822-a90d-4e05-81a2-704623b16ed2-original-pull-secret\") pod \"global-pull-secret-syncer-g42w5\" (UID: \"9e8f4822-a90d-4e05-81a2-704623b16ed2\") " pod="kube-system/global-pull-secret-syncer-g42w5" Apr 17 10:17:54.440777 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:54.440603 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50cdff83-6754-4d62-8a0c-bfe31b13a7eb-cert\") pod \"ingress-canary-gbbkk\" (UID: \"50cdff83-6754-4d62-8a0c-bfe31b13a7eb\") " pod="openshift-ingress-canary/ingress-canary-gbbkk" Apr 17 10:17:54.440777 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:54.440652 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-metrics-tls\") pod \"dns-default-27dzv\" (UID: \"156b0fef-9a3a-4de8-81ea-13ac2cb45c19\") " pod="openshift-dns/dns-default-27dzv" Apr 17 10:17:54.440777 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:54.440754 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 10:17:54.440777 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:54.440773 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 10:17:54.440934 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:54.440824 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-metrics-tls podName:156b0fef-9a3a-4de8-81ea-13ac2cb45c19 nodeName:}" failed. No retries permitted until 2026-04-17 10:17:56.440809376 +0000 UTC m=+36.176995709 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-metrics-tls") pod "dns-default-27dzv" (UID: "156b0fef-9a3a-4de8-81ea-13ac2cb45c19") : secret "dns-default-metrics-tls" not found Apr 17 10:17:54.440934 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:54.440839 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50cdff83-6754-4d62-8a0c-bfe31b13a7eb-cert podName:50cdff83-6754-4d62-8a0c-bfe31b13a7eb nodeName:}" failed. No retries permitted until 2026-04-17 10:17:56.440832637 +0000 UTC m=+36.177018968 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/50cdff83-6754-4d62-8a0c-bfe31b13a7eb-cert") pod "ingress-canary-gbbkk" (UID: "50cdff83-6754-4d62-8a0c-bfe31b13a7eb") : secret "canary-serving-cert" not found Apr 17 10:17:54.444505 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:54.444470 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9e8f4822-a90d-4e05-81a2-704623b16ed2-original-pull-secret\") pod \"global-pull-secret-syncer-g42w5\" (UID: \"9e8f4822-a90d-4e05-81a2-704623b16ed2\") " pod="kube-system/global-pull-secret-syncer-g42w5" Apr 17 10:17:54.541566 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:54.541525 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wp2bx\" (UniqueName: \"kubernetes.io/projected/775480d3-f4fc-4837-9404-54348aa3f205-kube-api-access-wp2bx\") pod \"network-check-target-xzjpm\" (UID: \"775480d3-f4fc-4837-9404-54348aa3f205\") " pod="openshift-network-diagnostics/network-check-target-xzjpm" Apr 17 10:17:54.541719 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:54.541647 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs\") pod \"network-metrics-daemon-6q4fn\" (UID: \"964ce349-102e-4852-b1d8-a4c58565a4a0\") " pod="openshift-multus/network-metrics-daemon-6q4fn" Apr 17 10:17:54.541805 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:54.541763 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 10:17:54.541859 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:54.541834 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs podName:964ce349-102e-4852-b1d8-a4c58565a4a0 nodeName:}" failed. 
No retries permitted until 2026-04-17 10:18:26.541811505 +0000 UTC m=+66.277997840 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs") pod "network-metrics-daemon-6q4fn" (UID: "964ce349-102e-4852-b1d8-a4c58565a4a0") : secret "metrics-daemon-secret" not found Apr 17 10:17:54.544866 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:54.544838 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp2bx\" (UniqueName: \"kubernetes.io/projected/775480d3-f4fc-4837-9404-54348aa3f205-kube-api-access-wp2bx\") pod \"network-check-target-xzjpm\" (UID: \"775480d3-f4fc-4837-9404-54348aa3f205\") " pod="openshift-network-diagnostics/network-check-target-xzjpm" Apr 17 10:17:54.674103 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:54.674014 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-g42w5" Apr 17 10:17:54.788047 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:54.788010 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xzjpm" Apr 17 10:17:55.192789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:55.192759 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xzjpm"] Apr 17 10:17:55.195434 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:55.195397 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-g42w5"] Apr 17 10:17:55.195974 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:55.195946 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod775480d3_f4fc_4837_9404_54348aa3f205.slice/crio-11ef76310d40635631ef1f6f0ea22c001b7311c1eecf5ee87227bcf10c335333 WatchSource:0}: Error finding container 11ef76310d40635631ef1f6f0ea22c001b7311c1eecf5ee87227bcf10c335333: Status 404 returned error can't find the container with id 11ef76310d40635631ef1f6f0ea22c001b7311c1eecf5ee87227bcf10c335333 Apr 17 10:17:55.198474 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:17:55.198444 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e8f4822_a90d_4e05_81a2_704623b16ed2.slice/crio-3dcb3ec39d9e120fba36834456eb150ea4289a2d5a5223dc59ed43be5e89736a WatchSource:0}: Error finding container 3dcb3ec39d9e120fba36834456eb150ea4289a2d5a5223dc59ed43be5e89736a: Status 404 returned error can't find the container with id 3dcb3ec39d9e120fba36834456eb150ea4289a2d5a5223dc59ed43be5e89736a Apr 17 10:17:56.004696 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:56.004650 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-g42w5" event={"ID":"9e8f4822-a90d-4e05-81a2-704623b16ed2","Type":"ContainerStarted","Data":"3dcb3ec39d9e120fba36834456eb150ea4289a2d5a5223dc59ed43be5e89736a"} Apr 17 10:17:56.005858 ip-10-0-128-180 kubenswrapper[2579]: I0417 
10:17:56.005823 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xzjpm" event={"ID":"775480d3-f4fc-4837-9404-54348aa3f205","Type":"ContainerStarted","Data":"11ef76310d40635631ef1f6f0ea22c001b7311c1eecf5ee87227bcf10c335333"} Apr 17 10:17:56.360567 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:56.360521 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ccb66962-e736-41cf-aff2-701b47e39579-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nn582\" (UID: \"ccb66962-e736-41cf-aff2-701b47e39579\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nn582" Apr 17 10:17:56.361033 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:56.360647 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-registry-tls\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:17:56.361033 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:56.360690 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 10:17:56.361033 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:56.360750 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 10:17:56.361033 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:56.360761 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-566b6b7754-2cpbf: secret "image-registry-tls" not found Apr 17 10:17:56.361033 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:56.360792 2579 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccb66962-e736-41cf-aff2-701b47e39579-networking-console-plugin-cert podName:ccb66962-e736-41cf-aff2-701b47e39579 nodeName:}" failed. No retries permitted until 2026-04-17 10:18:00.36077006 +0000 UTC m=+40.096956391 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ccb66962-e736-41cf-aff2-701b47e39579-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-nn582" (UID: "ccb66962-e736-41cf-aff2-701b47e39579") : secret "networking-console-plugin-cert" not found
Apr 17 10:17:56.361033 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:56.360817 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-registry-tls podName:fff2f824-dc09-4d90-817f-04bff462678a nodeName:}" failed. No retries permitted until 2026-04-17 10:18:00.360803587 +0000 UTC m=+40.096989924 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-registry-tls") pod "image-registry-566b6b7754-2cpbf" (UID: "fff2f824-dc09-4d90-817f-04bff462678a") : secret "image-registry-tls" not found
Apr 17 10:17:56.461354 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:56.461298 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50cdff83-6754-4d62-8a0c-bfe31b13a7eb-cert\") pod \"ingress-canary-gbbkk\" (UID: \"50cdff83-6754-4d62-8a0c-bfe31b13a7eb\") " pod="openshift-ingress-canary/ingress-canary-gbbkk"
Apr 17 10:17:56.461560 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:56.461365 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-metrics-tls\") pod \"dns-default-27dzv\" (UID: \"156b0fef-9a3a-4de8-81ea-13ac2cb45c19\") " pod="openshift-dns/dns-default-27dzv"
Apr 17 10:17:56.461560 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:56.461542 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 10:17:56.461560 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:56.461544 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 10:17:56.461726 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:56.461604 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-metrics-tls podName:156b0fef-9a3a-4de8-81ea-13ac2cb45c19 nodeName:}" failed. No retries permitted until 2026-04-17 10:18:00.461585923 +0000 UTC m=+40.197772258 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-metrics-tls") pod "dns-default-27dzv" (UID: "156b0fef-9a3a-4de8-81ea-13ac2cb45c19") : secret "dns-default-metrics-tls" not found
Apr 17 10:17:56.461726 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:17:56.461622 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50cdff83-6754-4d62-8a0c-bfe31b13a7eb-cert podName:50cdff83-6754-4d62-8a0c-bfe31b13a7eb nodeName:}" failed. No retries permitted until 2026-04-17 10:18:00.461613133 +0000 UTC m=+40.197799467 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/50cdff83-6754-4d62-8a0c-bfe31b13a7eb-cert") pod "ingress-canary-gbbkk" (UID: "50cdff83-6754-4d62-8a0c-bfe31b13a7eb") : secret "canary-serving-cert" not found
Apr 17 10:17:57.011477 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:57.011118 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vf6k6" event={"ID":"7715462b-006b-40d6-ba3b-5b84dfb6caeb","Type":"ContainerStarted","Data":"cf3eed070ed0630fe843e5836d3912035df7d6f44a9314cb3af0c42f3ce782fd"}
Apr 17 10:17:57.013705 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:57.013678 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fhlm5" event={"ID":"0837740f-2592-494a-bec7-ad834d8b557e","Type":"ContainerStarted","Data":"d6357b18aadea88eb8faa5a70f51d7d98c1c1a360ca8bbe9a11cfc2392d7a8f8"}
Apr 17 10:17:57.018896 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:57.018866 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xzjpm" event={"ID":"775480d3-f4fc-4837-9404-54348aa3f205","Type":"ContainerStarted","Data":"bed90dd6af996443490a2c7b0a995147cce86c0432cf92da4673705d87d39981"}
Apr 17 10:17:57.019037 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:57.019002 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-xzjpm"
Apr 17 10:17:57.027711 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:57.027661 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-vf6k6" podStartSLOduration=32.300295948 podStartE2EDuration="36.027645099s" podCreationTimestamp="2026-04-17 10:17:21 +0000 UTC" firstStartedPulling="2026-04-17 10:17:53.147513471 +0000 UTC m=+32.883699819" lastFinishedPulling="2026-04-17 10:17:56.874862623 +0000 UTC m=+36.611048970" observedRunningTime="2026-04-17 10:17:57.026880536 +0000 UTC m=+36.763066890" watchObservedRunningTime="2026-04-17 10:17:57.027645099 +0000 UTC m=+36.763831454"
Apr 17 10:17:57.045791 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:57.045731 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fhlm5" podStartSLOduration=32.543221833 podStartE2EDuration="36.045714646s" podCreationTimestamp="2026-04-17 10:17:21 +0000 UTC" firstStartedPulling="2026-04-17 10:17:53.07810779 +0000 UTC m=+32.814294120" lastFinishedPulling="2026-04-17 10:17:56.580600592 +0000 UTC m=+36.316786933" observedRunningTime="2026-04-17 10:17:57.044469136 +0000 UTC m=+36.780655488" watchObservedRunningTime="2026-04-17 10:17:57.045714646 +0000 UTC m=+36.781900998"
Apr 17 10:17:57.065482 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:17:57.064450 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-xzjpm" podStartSLOduration=34.387533642 podStartE2EDuration="36.064429508s" podCreationTimestamp="2026-04-17 10:17:21 +0000 UTC" firstStartedPulling="2026-04-17 10:17:55.198371527 +0000 UTC m=+34.934557860" lastFinishedPulling="2026-04-17 10:17:56.875267387 +0000 UTC m=+36.611453726" observedRunningTime="2026-04-17 10:17:57.064105534 +0000 UTC m=+36.800291888" watchObservedRunningTime="2026-04-17 10:17:57.064429508 +0000 UTC m=+36.800615853"
Apr 17 10:18:00.397129 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:00.397078 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-registry-tls\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf"
Apr 17 10:18:00.397956 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:00.397185 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ccb66962-e736-41cf-aff2-701b47e39579-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nn582\" (UID: \"ccb66962-e736-41cf-aff2-701b47e39579\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nn582"
Apr 17 10:18:00.397956 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:18:00.397253 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 10:18:00.397956 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:18:00.397279 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-566b6b7754-2cpbf: secret "image-registry-tls" not found
Apr 17 10:18:00.397956 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:18:00.397346 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-registry-tls podName:fff2f824-dc09-4d90-817f-04bff462678a nodeName:}" failed. No retries permitted until 2026-04-17 10:18:08.397325215 +0000 UTC m=+48.133511551 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-registry-tls") pod "image-registry-566b6b7754-2cpbf" (UID: "fff2f824-dc09-4d90-817f-04bff462678a") : secret "image-registry-tls" not found
Apr 17 10:18:00.397956 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:18:00.397356 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 10:18:00.397956 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:18:00.397436 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccb66962-e736-41cf-aff2-701b47e39579-networking-console-plugin-cert podName:ccb66962-e736-41cf-aff2-701b47e39579 nodeName:}" failed. No retries permitted until 2026-04-17 10:18:08.397400011 +0000 UTC m=+48.133586368 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ccb66962-e736-41cf-aff2-701b47e39579-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-nn582" (UID: "ccb66962-e736-41cf-aff2-701b47e39579") : secret "networking-console-plugin-cert" not found
Apr 17 10:18:00.498185 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:00.498141 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50cdff83-6754-4d62-8a0c-bfe31b13a7eb-cert\") pod \"ingress-canary-gbbkk\" (UID: \"50cdff83-6754-4d62-8a0c-bfe31b13a7eb\") " pod="openshift-ingress-canary/ingress-canary-gbbkk"
Apr 17 10:18:00.498473 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:00.498207 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-metrics-tls\") pod \"dns-default-27dzv\" (UID: \"156b0fef-9a3a-4de8-81ea-13ac2cb45c19\") " pod="openshift-dns/dns-default-27dzv"
Apr 17 10:18:00.498473 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:18:00.498354 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 10:18:00.498473 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:18:00.498443 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-metrics-tls podName:156b0fef-9a3a-4de8-81ea-13ac2cb45c19 nodeName:}" failed. No retries permitted until 2026-04-17 10:18:08.498403738 +0000 UTC m=+48.234590077 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-metrics-tls") pod "dns-default-27dzv" (UID: "156b0fef-9a3a-4de8-81ea-13ac2cb45c19") : secret "dns-default-metrics-tls" not found
Apr 17 10:18:00.498473 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:18:00.498356 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 10:18:00.498639 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:18:00.498496 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/50cdff83-6754-4d62-8a0c-bfe31b13a7eb-cert podName:50cdff83-6754-4d62-8a0c-bfe31b13a7eb nodeName:}" failed. No retries permitted until 2026-04-17 10:18:08.498482514 +0000 UTC m=+48.234668860 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/50cdff83-6754-4d62-8a0c-bfe31b13a7eb-cert") pod "ingress-canary-gbbkk" (UID: "50cdff83-6754-4d62-8a0c-bfe31b13a7eb") : secret "canary-serving-cert" not found
Apr 17 10:18:01.525022 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:01.524986 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-5h75w"]
Apr 17 10:18:01.528078 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:01.528047 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-5h75w"
Apr 17 10:18:01.530551 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:01.530524 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 17 10:18:01.530691 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:01.530578 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 17 10:18:01.531587 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:01.531566 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 17 10:18:01.531587 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:01.531574 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 17 10:18:01.531747 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:01.531626 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-7dgwt\""
Apr 17 10:18:01.536827 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:01.536802 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-5h75w"]
Apr 17 10:18:01.708484 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:01.708447 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4acfea86-c51d-4bc7-a81a-fe43884aca68-signing-key\") pod \"service-ca-865cb79987-5h75w\" (UID: \"4acfea86-c51d-4bc7-a81a-fe43884aca68\") " pod="openshift-service-ca/service-ca-865cb79987-5h75w"
Apr 17 10:18:01.708661 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:01.708542 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4acfea86-c51d-4bc7-a81a-fe43884aca68-signing-cabundle\") pod \"service-ca-865cb79987-5h75w\" (UID: \"4acfea86-c51d-4bc7-a81a-fe43884aca68\") " pod="openshift-service-ca/service-ca-865cb79987-5h75w"
Apr 17 10:18:01.708661 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:01.708564 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2f2w\" (UniqueName: \"kubernetes.io/projected/4acfea86-c51d-4bc7-a81a-fe43884aca68-kube-api-access-h2f2w\") pod \"service-ca-865cb79987-5h75w\" (UID: \"4acfea86-c51d-4bc7-a81a-fe43884aca68\") " pod="openshift-service-ca/service-ca-865cb79987-5h75w"
Apr 17 10:18:01.809393 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:01.809308 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4acfea86-c51d-4bc7-a81a-fe43884aca68-signing-cabundle\") pod \"service-ca-865cb79987-5h75w\" (UID: \"4acfea86-c51d-4bc7-a81a-fe43884aca68\") " pod="openshift-service-ca/service-ca-865cb79987-5h75w"
Apr 17 10:18:01.809393 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:01.809349 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2f2w\" (UniqueName: \"kubernetes.io/projected/4acfea86-c51d-4bc7-a81a-fe43884aca68-kube-api-access-h2f2w\") pod \"service-ca-865cb79987-5h75w\" (UID: \"4acfea86-c51d-4bc7-a81a-fe43884aca68\") " pod="openshift-service-ca/service-ca-865cb79987-5h75w"
Apr 17 10:18:01.809623 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:01.809445 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4acfea86-c51d-4bc7-a81a-fe43884aca68-signing-key\") pod \"service-ca-865cb79987-5h75w\" (UID: \"4acfea86-c51d-4bc7-a81a-fe43884aca68\") " pod="openshift-service-ca/service-ca-865cb79987-5h75w"
Apr 17 10:18:01.810056 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:01.810031 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/4acfea86-c51d-4bc7-a81a-fe43884aca68-signing-cabundle\") pod \"service-ca-865cb79987-5h75w\" (UID: \"4acfea86-c51d-4bc7-a81a-fe43884aca68\") " pod="openshift-service-ca/service-ca-865cb79987-5h75w"
Apr 17 10:18:01.813857 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:01.813834 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/4acfea86-c51d-4bc7-a81a-fe43884aca68-signing-key\") pod \"service-ca-865cb79987-5h75w\" (UID: \"4acfea86-c51d-4bc7-a81a-fe43884aca68\") " pod="openshift-service-ca/service-ca-865cb79987-5h75w"
Apr 17 10:18:01.817892 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:01.817867 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2f2w\" (UniqueName: \"kubernetes.io/projected/4acfea86-c51d-4bc7-a81a-fe43884aca68-kube-api-access-h2f2w\") pod \"service-ca-865cb79987-5h75w\" (UID: \"4acfea86-c51d-4bc7-a81a-fe43884aca68\") " pod="openshift-service-ca/service-ca-865cb79987-5h75w"
Apr 17 10:18:01.839847 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:01.839811 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-5h75w"
Apr 17 10:18:03.019778 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:03.019551 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-5h75w"]
Apr 17 10:18:03.024106 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:18:03.024080 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4acfea86_c51d_4bc7_a81a_fe43884aca68.slice/crio-d9c4748d9876d0225dee6db81147117052d71262da326e15f0938069e000a6d7 WatchSource:0}: Error finding container d9c4748d9876d0225dee6db81147117052d71262da326e15f0938069e000a6d7: Status 404 returned error can't find the container with id d9c4748d9876d0225dee6db81147117052d71262da326e15f0938069e000a6d7
Apr 17 10:18:03.036886 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:03.036857 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-5h75w" event={"ID":"4acfea86-c51d-4bc7-a81a-fe43884aca68","Type":"ContainerStarted","Data":"d9c4748d9876d0225dee6db81147117052d71262da326e15f0938069e000a6d7"}
Apr 17 10:18:04.040626 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:04.040589 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-g42w5" event={"ID":"9e8f4822-a90d-4e05-81a2-704623b16ed2","Type":"ContainerStarted","Data":"6d5d691369a01ebbdf8b9bbc80c66eee0eae4fcc6ae456d744d0df5226bc6aef"}
Apr 17 10:18:04.041890 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:04.041863 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-5h75w" event={"ID":"4acfea86-c51d-4bc7-a81a-fe43884aca68","Type":"ContainerStarted","Data":"9b3126a8404efe8e0fc82d6478f44b6be27d501fbfd7b6b2fc0f17f0b89acb8b"}
Apr 17 10:18:04.044067 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:04.044037 2579 generic.go:358] "Generic (PLEG): container finished" podID="c1879161-9d08-4f31-af72-d7e9b7b6b0c7" containerID="31c78a52e986a0c2c82c2381526826f51b5d99bcc00696e90ff55ddcf26fb953" exitCode=0
Apr 17 10:18:04.044184 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:04.044072 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jwf2j" event={"ID":"c1879161-9d08-4f31-af72-d7e9b7b6b0c7","Type":"ContainerDied","Data":"31c78a52e986a0c2c82c2381526826f51b5d99bcc00696e90ff55ddcf26fb953"}
Apr 17 10:18:04.054702 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:04.054641 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-g42w5" podStartSLOduration=6.363508436 podStartE2EDuration="14.05462491s" podCreationTimestamp="2026-04-17 10:17:50 +0000 UTC" firstStartedPulling="2026-04-17 10:17:55.200482384 +0000 UTC m=+34.936668715" lastFinishedPulling="2026-04-17 10:18:02.891598855 +0000 UTC m=+42.627785189" observedRunningTime="2026-04-17 10:18:04.05421137 +0000 UTC m=+43.790397725" watchObservedRunningTime="2026-04-17 10:18:04.05462491 +0000 UTC m=+43.790811264"
Apr 17 10:18:04.069999 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:04.069941 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-5h75w" podStartSLOduration=3.069925235 podStartE2EDuration="3.069925235s" podCreationTimestamp="2026-04-17 10:18:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 10:18:04.06890473 +0000 UTC m=+43.805091084" watchObservedRunningTime="2026-04-17 10:18:04.069925235 +0000 UTC m=+43.806111582"
Apr 17 10:18:05.049264 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:05.049229 2579 generic.go:358] "Generic (PLEG): container finished" podID="c1879161-9d08-4f31-af72-d7e9b7b6b0c7" containerID="751c4b8e3c80dfe21b5239f4f9b60d6ce9311d578a106cff61e5e7b66383746b" exitCode=0
Apr 17 10:18:05.049671 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:05.049312 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jwf2j" event={"ID":"c1879161-9d08-4f31-af72-d7e9b7b6b0c7","Type":"ContainerDied","Data":"751c4b8e3c80dfe21b5239f4f9b60d6ce9311d578a106cff61e5e7b66383746b"}
Apr 17 10:18:06.055320 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:06.055285 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jwf2j" event={"ID":"c1879161-9d08-4f31-af72-d7e9b7b6b0c7","Type":"ContainerStarted","Data":"603df9beea2d3b45ef46a3b2344962caeb0e0b6fbbf98d54c80e6137a17b1b7d"}
Apr 17 10:18:06.082381 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:06.082322 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jwf2j" podStartSLOduration=5.86297405 podStartE2EDuration="45.082303664s" podCreationTimestamp="2026-04-17 10:17:21 +0000 UTC" firstStartedPulling="2026-04-17 10:17:23.68801654 +0000 UTC m=+3.424202877" lastFinishedPulling="2026-04-17 10:18:02.907346151 +0000 UTC m=+42.643532491" observedRunningTime="2026-04-17 10:18:06.080984305 +0000 UTC m=+45.817170666" watchObservedRunningTime="2026-04-17 10:18:06.082303664 +0000 UTC m=+45.818490018"
Apr 17 10:18:08.466871 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:08.466827 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ccb66962-e736-41cf-aff2-701b47e39579-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nn582\" (UID: \"ccb66962-e736-41cf-aff2-701b47e39579\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nn582"
Apr 17 10:18:08.467315 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:18:08.467001 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 10:18:08.467315 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:08.467014 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-registry-tls\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf"
Apr 17 10:18:08.467315 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:18:08.467086 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ccb66962-e736-41cf-aff2-701b47e39579-networking-console-plugin-cert podName:ccb66962-e736-41cf-aff2-701b47e39579 nodeName:}" failed. No retries permitted until 2026-04-17 10:18:24.467062417 +0000 UTC m=+64.203248751 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ccb66962-e736-41cf-aff2-701b47e39579-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-nn582" (UID: "ccb66962-e736-41cf-aff2-701b47e39579") : secret "networking-console-plugin-cert" not found
Apr 17 10:18:08.469374 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:08.469356 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-registry-tls\") pod \"image-registry-566b6b7754-2cpbf\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " pod="openshift-image-registry/image-registry-566b6b7754-2cpbf"
Apr 17 10:18:08.525615 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:08.525573 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-566b6b7754-2cpbf"
Apr 17 10:18:08.567911 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:08.567878 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50cdff83-6754-4d62-8a0c-bfe31b13a7eb-cert\") pod \"ingress-canary-gbbkk\" (UID: \"50cdff83-6754-4d62-8a0c-bfe31b13a7eb\") " pod="openshift-ingress-canary/ingress-canary-gbbkk"
Apr 17 10:18:08.567911 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:08.567922 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-metrics-tls\") pod \"dns-default-27dzv\" (UID: \"156b0fef-9a3a-4de8-81ea-13ac2cb45c19\") " pod="openshift-dns/dns-default-27dzv"
Apr 17 10:18:08.571625 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:08.571373 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/156b0fef-9a3a-4de8-81ea-13ac2cb45c19-metrics-tls\") pod \"dns-default-27dzv\" (UID: \"156b0fef-9a3a-4de8-81ea-13ac2cb45c19\") " pod="openshift-dns/dns-default-27dzv"
Apr 17 10:18:08.571625 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:08.571449 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/50cdff83-6754-4d62-8a0c-bfe31b13a7eb-cert\") pod \"ingress-canary-gbbkk\" (UID: \"50cdff83-6754-4d62-8a0c-bfe31b13a7eb\") " pod="openshift-ingress-canary/ingress-canary-gbbkk"
Apr 17 10:18:08.588357 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:08.587875 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gbbkk"
Apr 17 10:18:08.596083 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:08.596059 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-27dzv"
Apr 17 10:18:08.653733 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:08.653677 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-566b6b7754-2cpbf"]
Apr 17 10:18:08.658026 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:18:08.657974 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfff2f824_dc09_4d90_817f_04bff462678a.slice/crio-17f0a29dae2b51d3727d7f59c8f4f972e93a7b5d63b127cdaf341367a90fe9d3 WatchSource:0}: Error finding container 17f0a29dae2b51d3727d7f59c8f4f972e93a7b5d63b127cdaf341367a90fe9d3: Status 404 returned error can't find the container with id 17f0a29dae2b51d3727d7f59c8f4f972e93a7b5d63b127cdaf341367a90fe9d3
Apr 17 10:18:08.722165 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:08.722137 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gbbkk"]
Apr 17 10:18:08.727021 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:18:08.726988 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50cdff83_6754_4d62_8a0c_bfe31b13a7eb.slice/crio-f608f6aedf3306ec628a2d8a5cd0b684f59f0cc0234f59cbb5caebef20c52129 WatchSource:0}: Error finding container f608f6aedf3306ec628a2d8a5cd0b684f59f0cc0234f59cbb5caebef20c52129: Status 404 returned error can't find the container with id f608f6aedf3306ec628a2d8a5cd0b684f59f0cc0234f59cbb5caebef20c52129
Apr 17 10:18:08.741024 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:08.740993 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-27dzv"]
Apr 17 10:18:08.744031 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:18:08.744000 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod156b0fef_9a3a_4de8_81ea_13ac2cb45c19.slice/crio-bf8d026c97d3caa5e84a0603dbac442786a47cb85ca03bbd3782cc6a40012088 WatchSource:0}: Error finding container bf8d026c97d3caa5e84a0603dbac442786a47cb85ca03bbd3782cc6a40012088: Status 404 returned error can't find the container with id bf8d026c97d3caa5e84a0603dbac442786a47cb85ca03bbd3782cc6a40012088
Apr 17 10:18:09.067147 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:09.067112 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" event={"ID":"fff2f824-dc09-4d90-817f-04bff462678a","Type":"ContainerStarted","Data":"a8c25b3bb3eafc8e9457f8ca61cf7462162bff5f25046593279a71840b4fc4ca"}
Apr 17 10:18:09.067147 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:09.067147 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" event={"ID":"fff2f824-dc09-4d90-817f-04bff462678a","Type":"ContainerStarted","Data":"17f0a29dae2b51d3727d7f59c8f4f972e93a7b5d63b127cdaf341367a90fe9d3"}
Apr 17 10:18:09.067382 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:09.067207 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-566b6b7754-2cpbf"
Apr 17 10:18:09.068218 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:09.068189 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-27dzv" event={"ID":"156b0fef-9a3a-4de8-81ea-13ac2cb45c19","Type":"ContainerStarted","Data":"bf8d026c97d3caa5e84a0603dbac442786a47cb85ca03bbd3782cc6a40012088"}
Apr 17 10:18:09.069084 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:09.069062 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gbbkk" event={"ID":"50cdff83-6754-4d62-8a0c-bfe31b13a7eb","Type":"ContainerStarted","Data":"f608f6aedf3306ec628a2d8a5cd0b684f59f0cc0234f59cbb5caebef20c52129"}
Apr 17 10:18:09.089631 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:09.089574 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" podStartSLOduration=48.089555563 podStartE2EDuration="48.089555563s" podCreationTimestamp="2026-04-17 10:17:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 10:18:09.087736276 +0000 UTC m=+48.823922641" watchObservedRunningTime="2026-04-17 10:18:09.089555563 +0000 UTC m=+48.825741917"
Apr 17 10:18:12.079968 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:12.079931 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gbbkk" event={"ID":"50cdff83-6754-4d62-8a0c-bfe31b13a7eb","Type":"ContainerStarted","Data":"e32fd4869a6615be77f2f122ab0164dc71beacc898ba8aaf81606aa3b3b6908f"}
Apr 17 10:18:12.081499 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:12.081465 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-27dzv" event={"ID":"156b0fef-9a3a-4de8-81ea-13ac2cb45c19","Type":"ContainerStarted","Data":"daeba612d2ae0e16236e3bb1ff625ffff2a75c9b805c6ffb0b66c9bae863f9be"}
Apr 17 10:18:12.081499 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:12.081497 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-27dzv" event={"ID":"156b0fef-9a3a-4de8-81ea-13ac2cb45c19","Type":"ContainerStarted","Data":"82e56aec47d2f1280de3939777a81352d294eb1d2c0bbc4e931fb4d88a1f90ab"}
Apr 17 10:18:12.081641 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:12.081597 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-27dzv"
Apr 17 10:18:12.097730 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:12.097670 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gbbkk" podStartSLOduration=17.371128748 podStartE2EDuration="20.097652261s" podCreationTimestamp="2026-04-17 10:17:52 +0000 UTC" firstStartedPulling="2026-04-17 10:18:08.729229466 +0000 UTC m=+48.465415801" lastFinishedPulling="2026-04-17 10:18:11.455752981 +0000 UTC m=+51.191939314" observedRunningTime="2026-04-17 10:18:12.096919346 +0000 UTC m=+51.833105692" watchObservedRunningTime="2026-04-17 10:18:12.097652261 +0000 UTC m=+51.833838615"
Apr 17 10:18:12.113963 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:12.113911 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-27dzv" podStartSLOduration=17.400378257 podStartE2EDuration="20.113897051s" podCreationTimestamp="2026-04-17 10:17:52 +0000 UTC" firstStartedPulling="2026-04-17 10:18:08.746011261 +0000 UTC m=+48.482197597" lastFinishedPulling="2026-04-17 10:18:11.459530057 +0000 UTC m=+51.195716391" observedRunningTime="2026-04-17 10:18:12.11293315 +0000 UTC m=+51.849119505" watchObservedRunningTime="2026-04-17 10:18:12.113897051 +0000 UTC m=+51.850083403"
Apr 17 10:18:20.002756 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:20.002709 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ggsw6"
Apr 17 10:18:22.086834 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.086801 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-27dzv"
Apr 17 10:18:22.824584 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.824547 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-h4frw"]
Apr 17 10:18:22.862498 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.862456 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-h4frw"]
Apr 17 10:18:22.862660 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.862598 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-h4frw"
Apr 17 10:18:22.865027 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.865001 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 17 10:18:22.865183 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.865040 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 17 10:18:22.865434 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.865376 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-qtdb2\""
Apr 17 10:18:22.865569 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.865462 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 17 10:18:22.930680 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.930648 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ssrzt"]
Apr 17 10:18:22.947872 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.947838 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-5c64q"]
Apr 17 10:18:22.948079 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.948056 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ssrzt"
Apr 17 10:18:22.950543 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.950518 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 17 10:18:22.950543 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.950537 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-6mk9b\""
Apr 17 10:18:22.950751 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.950569 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 17 10:18:22.965641 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.965598 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pwt5z"]
Apr 17 10:18:22.965812 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.965759 2579 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-5c64q" Apr 17 10:18:22.970332 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.970307 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 17 10:18:22.973927 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.973907 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 10:18:22.974155 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.974099 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-r2r5q\"" Apr 17 10:18:22.974668 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.974646 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 10:18:22.975770 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.975748 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4g66\" (UniqueName: \"kubernetes.io/projected/60bcfb22-9d7f-4ed5-894f-3be5795bb21e-kube-api-access-z4g66\") pod \"volume-data-source-validator-7c6cbb6c87-ssrzt\" (UID: \"60bcfb22-9d7f-4ed5-894f-3be5795bb21e\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ssrzt" Apr 17 10:18:22.975901 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.975787 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhft8\" (UniqueName: \"kubernetes.io/projected/13e5ced9-25b8-4b3f-9a0d-ac11761a6140-kube-api-access-vhft8\") pod \"cluster-samples-operator-6dc5bdb6b4-h4frw\" (UID: \"13e5ced9-25b8-4b3f-9a0d-ac11761a6140\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-h4frw" Apr 17 10:18:22.975901 ip-10-0-128-180 kubenswrapper[2579]: I0417 
10:18:22.975825 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/13e5ced9-25b8-4b3f-9a0d-ac11761a6140-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-h4frw\" (UID: \"13e5ced9-25b8-4b3f-9a0d-ac11761a6140\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-h4frw" Apr 17 10:18:22.977653 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.977631 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 17 10:18:22.983774 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.983754 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-nskql"] Apr 17 10:18:22.983920 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.983904 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pwt5z" Apr 17 10:18:22.992286 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.992265 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 17 10:18:22.995827 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.995806 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-75b9fd94fb-6dvbt"] Apr 17 10:18:22.995955 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.995940 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-nskql" Apr 17 10:18:22.996744 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:22.996724 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 17 10:18:23.000257 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.000223 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-qwzgr\"" Apr 17 10:18:23.000634 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.000612 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 17 10:18:23.001871 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.001852 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 17 10:18:23.001989 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.001940 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 17 10:18:23.002067 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.001988 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 17 10:18:23.002067 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.002031 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 17 10:18:23.002432 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.002395 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 17 10:18:23.002603 
ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.002570 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 17 10:18:23.003449 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.003388 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-4xrl6\"" Apr 17 10:18:23.009381 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.009355 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ssrzt"] Apr 17 10:18:23.009381 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.009380 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pwt5z"] Apr 17 10:18:23.009542 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.009398 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-nskql"] Apr 17 10:18:23.009542 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.009408 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-75b9fd94fb-6dvbt"] Apr 17 10:18:23.009542 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.009515 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-5c64q"] Apr 17 10:18:23.009542 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.009521 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-75b9fd94fb-6dvbt" Apr 17 10:18:23.016149 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.016120 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 10:18:23.016360 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.016149 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 10:18:23.016360 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.016162 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 10:18:23.016360 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.016155 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 10:18:23.016360 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.016289 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-5wzqc\"" Apr 17 10:18:23.016360 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.016324 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 10:18:23.016679 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.016665 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 10:18:23.029016 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.028994 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 17 10:18:23.031196 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.031175 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-ndgrn"] Apr 17 10:18:23.045627 
ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.045599 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ndgrn" Apr 17 10:18:23.048292 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.048272 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 10:18:23.048292 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.048285 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 10:18:23.048606 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.048588 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 17 10:18:23.050013 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.050000 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-jk2pg\"" Apr 17 10:18:23.053009 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.052990 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 17 10:18:23.054523 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.054501 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-ndgrn"] Apr 17 10:18:23.077097 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077021 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5bf035ab-c3f4-4ce5-bd46-9b55d72879b7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ndgrn\" (UID: \"5bf035ab-c3f4-4ce5-bd46-9b55d72879b7\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ndgrn" Apr 17 10:18:23.077097 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077062 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/386a64c0-8bc9-4f79-b5ac-b26c8b1f4603-trusted-ca\") pod \"console-operator-9d4b6777b-nskql\" (UID: \"386a64c0-8bc9-4f79-b5ac-b26c8b1f4603\") " pod="openshift-console-operator/console-operator-9d4b6777b-nskql" Apr 17 10:18:23.077324 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077103 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4thj\" (UniqueName: \"kubernetes.io/projected/386a64c0-8bc9-4f79-b5ac-b26c8b1f4603-kube-api-access-g4thj\") pod \"console-operator-9d4b6777b-nskql\" (UID: \"386a64c0-8bc9-4f79-b5ac-b26c8b1f4603\") " pod="openshift-console-operator/console-operator-9d4b6777b-nskql" Apr 17 10:18:23.077324 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077174 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4g66\" (UniqueName: \"kubernetes.io/projected/60bcfb22-9d7f-4ed5-894f-3be5795bb21e-kube-api-access-z4g66\") pod \"volume-data-source-validator-7c6cbb6c87-ssrzt\" (UID: \"60bcfb22-9d7f-4ed5-894f-3be5795bb21e\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ssrzt" Apr 17 10:18:23.077324 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077208 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0970ce61-ac90-4b77-8b8b-3e0f732cce33-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-pwt5z\" (UID: \"0970ce61-ac90-4b77-8b8b-3e0f732cce33\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pwt5z" Apr 17 10:18:23.077324 
ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077233 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5bf035ab-c3f4-4ce5-bd46-9b55d72879b7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-ndgrn\" (UID: \"5bf035ab-c3f4-4ce5-bd46-9b55d72879b7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ndgrn" Apr 17 10:18:23.077324 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077258 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/210b0ad6-9a05-4f06-b86a-9dcf876a3ae5-default-certificate\") pod \"router-default-75b9fd94fb-6dvbt\" (UID: \"210b0ad6-9a05-4f06-b86a-9dcf876a3ae5\") " pod="openshift-ingress/router-default-75b9fd94fb-6dvbt" Apr 17 10:18:23.077324 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077284 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/210b0ad6-9a05-4f06-b86a-9dcf876a3ae5-service-ca-bundle\") pod \"router-default-75b9fd94fb-6dvbt\" (UID: \"210b0ad6-9a05-4f06-b86a-9dcf876a3ae5\") " pod="openshift-ingress/router-default-75b9fd94fb-6dvbt" Apr 17 10:18:23.077541 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077435 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/68dbc54c-11d7-4146-ac64-235b136dbc33-snapshots\") pod \"insights-operator-585dfdc468-5c64q\" (UID: \"68dbc54c-11d7-4146-ac64-235b136dbc33\") " pod="openshift-insights/insights-operator-585dfdc468-5c64q" Apr 17 10:18:23.077541 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077459 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/386a64c0-8bc9-4f79-b5ac-b26c8b1f4603-config\") pod \"console-operator-9d4b6777b-nskql\" (UID: \"386a64c0-8bc9-4f79-b5ac-b26c8b1f4603\") " pod="openshift-console-operator/console-operator-9d4b6777b-nskql" Apr 17 10:18:23.077541 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077504 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjz6c\" (UniqueName: \"kubernetes.io/projected/210b0ad6-9a05-4f06-b86a-9dcf876a3ae5-kube-api-access-cjz6c\") pod \"router-default-75b9fd94fb-6dvbt\" (UID: \"210b0ad6-9a05-4f06-b86a-9dcf876a3ae5\") " pod="openshift-ingress/router-default-75b9fd94fb-6dvbt" Apr 17 10:18:23.077541 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077537 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68dbc54c-11d7-4146-ac64-235b136dbc33-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-5c64q\" (UID: \"68dbc54c-11d7-4146-ac64-235b136dbc33\") " pod="openshift-insights/insights-operator-585dfdc468-5c64q" Apr 17 10:18:23.077666 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077576 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9plcc\" (UniqueName: \"kubernetes.io/projected/68dbc54c-11d7-4146-ac64-235b136dbc33-kube-api-access-9plcc\") pod \"insights-operator-585dfdc468-5c64q\" (UID: \"68dbc54c-11d7-4146-ac64-235b136dbc33\") " pod="openshift-insights/insights-operator-585dfdc468-5c64q" Apr 17 10:18:23.077666 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077596 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/68dbc54c-11d7-4146-ac64-235b136dbc33-tmp\") pod \"insights-operator-585dfdc468-5c64q\" (UID: \"68dbc54c-11d7-4146-ac64-235b136dbc33\") " 
pod="openshift-insights/insights-operator-585dfdc468-5c64q" Apr 17 10:18:23.077666 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077644 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jd59\" (UniqueName: \"kubernetes.io/projected/5bf035ab-c3f4-4ce5-bd46-9b55d72879b7-kube-api-access-8jd59\") pod \"cluster-monitoring-operator-75587bd455-ndgrn\" (UID: \"5bf035ab-c3f4-4ce5-bd46-9b55d72879b7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ndgrn" Apr 17 10:18:23.077758 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077684 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/386a64c0-8bc9-4f79-b5ac-b26c8b1f4603-serving-cert\") pod \"console-operator-9d4b6777b-nskql\" (UID: \"386a64c0-8bc9-4f79-b5ac-b26c8b1f4603\") " pod="openshift-console-operator/console-operator-9d4b6777b-nskql" Apr 17 10:18:23.077758 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077712 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/210b0ad6-9a05-4f06-b86a-9dcf876a3ae5-stats-auth\") pod \"router-default-75b9fd94fb-6dvbt\" (UID: \"210b0ad6-9a05-4f06-b86a-9dcf876a3ae5\") " pod="openshift-ingress/router-default-75b9fd94fb-6dvbt" Apr 17 10:18:23.077758 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077750 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vhft8\" (UniqueName: \"kubernetes.io/projected/13e5ced9-25b8-4b3f-9a0d-ac11761a6140-kube-api-access-vhft8\") pod \"cluster-samples-operator-6dc5bdb6b4-h4frw\" (UID: \"13e5ced9-25b8-4b3f-9a0d-ac11761a6140\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-h4frw" Apr 17 10:18:23.077842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077767 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68dbc54c-11d7-4146-ac64-235b136dbc33-service-ca-bundle\") pod \"insights-operator-585dfdc468-5c64q\" (UID: \"68dbc54c-11d7-4146-ac64-235b136dbc33\") " pod="openshift-insights/insights-operator-585dfdc468-5c64q" Apr 17 10:18:23.077842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077787 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/13e5ced9-25b8-4b3f-9a0d-ac11761a6140-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-h4frw\" (UID: \"13e5ced9-25b8-4b3f-9a0d-ac11761a6140\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-h4frw" Apr 17 10:18:23.077842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077808 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68dbc54c-11d7-4146-ac64-235b136dbc33-serving-cert\") pod \"insights-operator-585dfdc468-5c64q\" (UID: \"68dbc54c-11d7-4146-ac64-235b136dbc33\") " pod="openshift-insights/insights-operator-585dfdc468-5c64q" Apr 17 10:18:23.077842 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077823 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/210b0ad6-9a05-4f06-b86a-9dcf876a3ae5-metrics-certs\") pod \"router-default-75b9fd94fb-6dvbt\" (UID: \"210b0ad6-9a05-4f06-b86a-9dcf876a3ae5\") " pod="openshift-ingress/router-default-75b9fd94fb-6dvbt" Apr 17 10:18:23.078005 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077846 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0970ce61-ac90-4b77-8b8b-3e0f732cce33-serving-cert\") 
pod \"kube-storage-version-migrator-operator-6769c5d45-pwt5z\" (UID: \"0970ce61-ac90-4b77-8b8b-3e0f732cce33\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pwt5z" Apr 17 10:18:23.078005 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.077867 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2rfq\" (UniqueName: \"kubernetes.io/projected/0970ce61-ac90-4b77-8b8b-3e0f732cce33-kube-api-access-h2rfq\") pod \"kube-storage-version-migrator-operator-6769c5d45-pwt5z\" (UID: \"0970ce61-ac90-4b77-8b8b-3e0f732cce33\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pwt5z" Apr 17 10:18:23.080289 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.080258 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/13e5ced9-25b8-4b3f-9a0d-ac11761a6140-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-h4frw\" (UID: \"13e5ced9-25b8-4b3f-9a0d-ac11761a6140\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-h4frw" Apr 17 10:18:23.101080 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.101049 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhft8\" (UniqueName: \"kubernetes.io/projected/13e5ced9-25b8-4b3f-9a0d-ac11761a6140-kube-api-access-vhft8\") pod \"cluster-samples-operator-6dc5bdb6b4-h4frw\" (UID: \"13e5ced9-25b8-4b3f-9a0d-ac11761a6140\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-h4frw" Apr 17 10:18:23.101542 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.101523 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4g66\" (UniqueName: \"kubernetes.io/projected/60bcfb22-9d7f-4ed5-894f-3be5795bb21e-kube-api-access-z4g66\") pod 
\"volume-data-source-validator-7c6cbb6c87-ssrzt\" (UID: \"60bcfb22-9d7f-4ed5-894f-3be5795bb21e\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ssrzt" Apr 17 10:18:23.172305 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.172264 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-h4frw" Apr 17 10:18:23.179597 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.178827 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0970ce61-ac90-4b77-8b8b-3e0f732cce33-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-pwt5z\" (UID: \"0970ce61-ac90-4b77-8b8b-3e0f732cce33\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pwt5z" Apr 17 10:18:23.179597 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.178873 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5bf035ab-c3f4-4ce5-bd46-9b55d72879b7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-ndgrn\" (UID: \"5bf035ab-c3f4-4ce5-bd46-9b55d72879b7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ndgrn" Apr 17 10:18:23.179597 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.178899 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/210b0ad6-9a05-4f06-b86a-9dcf876a3ae5-default-certificate\") pod \"router-default-75b9fd94fb-6dvbt\" (UID: \"210b0ad6-9a05-4f06-b86a-9dcf876a3ae5\") " pod="openshift-ingress/router-default-75b9fd94fb-6dvbt" Apr 17 10:18:23.179597 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.178926 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/210b0ad6-9a05-4f06-b86a-9dcf876a3ae5-service-ca-bundle\") pod \"router-default-75b9fd94fb-6dvbt\" (UID: \"210b0ad6-9a05-4f06-b86a-9dcf876a3ae5\") " pod="openshift-ingress/router-default-75b9fd94fb-6dvbt"
Apr 17 10:18:23.179597 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.178954 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/68dbc54c-11d7-4146-ac64-235b136dbc33-snapshots\") pod \"insights-operator-585dfdc468-5c64q\" (UID: \"68dbc54c-11d7-4146-ac64-235b136dbc33\") " pod="openshift-insights/insights-operator-585dfdc468-5c64q"
Apr 17 10:18:23.179597 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.178975 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/386a64c0-8bc9-4f79-b5ac-b26c8b1f4603-config\") pod \"console-operator-9d4b6777b-nskql\" (UID: \"386a64c0-8bc9-4f79-b5ac-b26c8b1f4603\") " pod="openshift-console-operator/console-operator-9d4b6777b-nskql"
Apr 17 10:18:23.179597 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.179000 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjz6c\" (UniqueName: \"kubernetes.io/projected/210b0ad6-9a05-4f06-b86a-9dcf876a3ae5-kube-api-access-cjz6c\") pod \"router-default-75b9fd94fb-6dvbt\" (UID: \"210b0ad6-9a05-4f06-b86a-9dcf876a3ae5\") " pod="openshift-ingress/router-default-75b9fd94fb-6dvbt"
Apr 17 10:18:23.179597 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.179024 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68dbc54c-11d7-4146-ac64-235b136dbc33-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-5c64q\" (UID: \"68dbc54c-11d7-4146-ac64-235b136dbc33\") " pod="openshift-insights/insights-operator-585dfdc468-5c64q"
Apr 17 10:18:23.179597 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.179070 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9plcc\" (UniqueName: \"kubernetes.io/projected/68dbc54c-11d7-4146-ac64-235b136dbc33-kube-api-access-9plcc\") pod \"insights-operator-585dfdc468-5c64q\" (UID: \"68dbc54c-11d7-4146-ac64-235b136dbc33\") " pod="openshift-insights/insights-operator-585dfdc468-5c64q"
Apr 17 10:18:23.179597 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.179093 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/68dbc54c-11d7-4146-ac64-235b136dbc33-tmp\") pod \"insights-operator-585dfdc468-5c64q\" (UID: \"68dbc54c-11d7-4146-ac64-235b136dbc33\") " pod="openshift-insights/insights-operator-585dfdc468-5c64q"
Apr 17 10:18:23.179597 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.179125 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jd59\" (UniqueName: \"kubernetes.io/projected/5bf035ab-c3f4-4ce5-bd46-9b55d72879b7-kube-api-access-8jd59\") pod \"cluster-monitoring-operator-75587bd455-ndgrn\" (UID: \"5bf035ab-c3f4-4ce5-bd46-9b55d72879b7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ndgrn"
Apr 17 10:18:23.179597 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.179150 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/386a64c0-8bc9-4f79-b5ac-b26c8b1f4603-serving-cert\") pod \"console-operator-9d4b6777b-nskql\" (UID: \"386a64c0-8bc9-4f79-b5ac-b26c8b1f4603\") " pod="openshift-console-operator/console-operator-9d4b6777b-nskql"
Apr 17 10:18:23.179597 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.179175 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/210b0ad6-9a05-4f06-b86a-9dcf876a3ae5-stats-auth\") pod \"router-default-75b9fd94fb-6dvbt\" (UID: \"210b0ad6-9a05-4f06-b86a-9dcf876a3ae5\") " pod="openshift-ingress/router-default-75b9fd94fb-6dvbt"
Apr 17 10:18:23.179597 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.179213 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68dbc54c-11d7-4146-ac64-235b136dbc33-service-ca-bundle\") pod \"insights-operator-585dfdc468-5c64q\" (UID: \"68dbc54c-11d7-4146-ac64-235b136dbc33\") " pod="openshift-insights/insights-operator-585dfdc468-5c64q"
Apr 17 10:18:23.179597 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.179249 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68dbc54c-11d7-4146-ac64-235b136dbc33-serving-cert\") pod \"insights-operator-585dfdc468-5c64q\" (UID: \"68dbc54c-11d7-4146-ac64-235b136dbc33\") " pod="openshift-insights/insights-operator-585dfdc468-5c64q"
Apr 17 10:18:23.179597 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.179273 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/210b0ad6-9a05-4f06-b86a-9dcf876a3ae5-metrics-certs\") pod \"router-default-75b9fd94fb-6dvbt\" (UID: \"210b0ad6-9a05-4f06-b86a-9dcf876a3ae5\") " pod="openshift-ingress/router-default-75b9fd94fb-6dvbt"
Apr 17 10:18:23.180391 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.179298 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0970ce61-ac90-4b77-8b8b-3e0f732cce33-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-pwt5z\" (UID: \"0970ce61-ac90-4b77-8b8b-3e0f732cce33\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pwt5z"
Apr 17 10:18:23.180391 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.179322 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2rfq\" (UniqueName: \"kubernetes.io/projected/0970ce61-ac90-4b77-8b8b-3e0f732cce33-kube-api-access-h2rfq\") pod \"kube-storage-version-migrator-operator-6769c5d45-pwt5z\" (UID: \"0970ce61-ac90-4b77-8b8b-3e0f732cce33\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pwt5z"
Apr 17 10:18:23.180391 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.179367 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5bf035ab-c3f4-4ce5-bd46-9b55d72879b7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ndgrn\" (UID: \"5bf035ab-c3f4-4ce5-bd46-9b55d72879b7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ndgrn"
Apr 17 10:18:23.180391 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.179392 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/386a64c0-8bc9-4f79-b5ac-b26c8b1f4603-trusted-ca\") pod \"console-operator-9d4b6777b-nskql\" (UID: \"386a64c0-8bc9-4f79-b5ac-b26c8b1f4603\") " pod="openshift-console-operator/console-operator-9d4b6777b-nskql"
Apr 17 10:18:23.180391 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.179450 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4thj\" (UniqueName: \"kubernetes.io/projected/386a64c0-8bc9-4f79-b5ac-b26c8b1f4603-kube-api-access-g4thj\") pod \"console-operator-9d4b6777b-nskql\" (UID: \"386a64c0-8bc9-4f79-b5ac-b26c8b1f4603\") " pod="openshift-console-operator/console-operator-9d4b6777b-nskql"
Apr 17 10:18:23.180894 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.180797 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/68dbc54c-11d7-4146-ac64-235b136dbc33-tmp\") pod \"insights-operator-585dfdc468-5c64q\" (UID: \"68dbc54c-11d7-4146-ac64-235b136dbc33\") " pod="openshift-insights/insights-operator-585dfdc468-5c64q"
Apr 17 10:18:23.181659 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.181528 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68dbc54c-11d7-4146-ac64-235b136dbc33-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-5c64q\" (UID: \"68dbc54c-11d7-4146-ac64-235b136dbc33\") " pod="openshift-insights/insights-operator-585dfdc468-5c64q"
Apr 17 10:18:23.181659 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.181586 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/68dbc54c-11d7-4146-ac64-235b136dbc33-snapshots\") pod \"insights-operator-585dfdc468-5c64q\" (UID: \"68dbc54c-11d7-4146-ac64-235b136dbc33\") " pod="openshift-insights/insights-operator-585dfdc468-5c64q"
Apr 17 10:18:23.181829 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.181753 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/210b0ad6-9a05-4f06-b86a-9dcf876a3ae5-service-ca-bundle\") pod \"router-default-75b9fd94fb-6dvbt\" (UID: \"210b0ad6-9a05-4f06-b86a-9dcf876a3ae5\") " pod="openshift-ingress/router-default-75b9fd94fb-6dvbt"
Apr 17 10:18:23.182192 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.182171 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/386a64c0-8bc9-4f79-b5ac-b26c8b1f4603-config\") pod \"console-operator-9d4b6777b-nskql\" (UID: \"386a64c0-8bc9-4f79-b5ac-b26c8b1f4603\") " pod="openshift-console-operator/console-operator-9d4b6777b-nskql"
Apr 17 10:18:23.182339 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.182311 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5bf035ab-c3f4-4ce5-bd46-9b55d72879b7-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-ndgrn\" (UID: \"5bf035ab-c3f4-4ce5-bd46-9b55d72879b7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ndgrn"
Apr 17 10:18:23.182587 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.182315 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/386a64c0-8bc9-4f79-b5ac-b26c8b1f4603-trusted-ca\") pod \"console-operator-9d4b6777b-nskql\" (UID: \"386a64c0-8bc9-4f79-b5ac-b26c8b1f4603\") " pod="openshift-console-operator/console-operator-9d4b6777b-nskql"
Apr 17 10:18:23.182937 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.182759 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68dbc54c-11d7-4146-ac64-235b136dbc33-service-ca-bundle\") pod \"insights-operator-585dfdc468-5c64q\" (UID: \"68dbc54c-11d7-4146-ac64-235b136dbc33\") " pod="openshift-insights/insights-operator-585dfdc468-5c64q"
Apr 17 10:18:23.183078 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.182951 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0970ce61-ac90-4b77-8b8b-3e0f732cce33-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-pwt5z\" (UID: \"0970ce61-ac90-4b77-8b8b-3e0f732cce33\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pwt5z"
Apr 17 10:18:23.184191 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.184166 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/210b0ad6-9a05-4f06-b86a-9dcf876a3ae5-default-certificate\") pod \"router-default-75b9fd94fb-6dvbt\" (UID: \"210b0ad6-9a05-4f06-b86a-9dcf876a3ae5\") " pod="openshift-ingress/router-default-75b9fd94fb-6dvbt"
Apr 17 10:18:23.185174 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.185138 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/210b0ad6-9a05-4f06-b86a-9dcf876a3ae5-metrics-certs\") pod \"router-default-75b9fd94fb-6dvbt\" (UID: \"210b0ad6-9a05-4f06-b86a-9dcf876a3ae5\") " pod="openshift-ingress/router-default-75b9fd94fb-6dvbt"
Apr 17 10:18:23.185338 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.185312 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68dbc54c-11d7-4146-ac64-235b136dbc33-serving-cert\") pod \"insights-operator-585dfdc468-5c64q\" (UID: \"68dbc54c-11d7-4146-ac64-235b136dbc33\") " pod="openshift-insights/insights-operator-585dfdc468-5c64q"
Apr 17 10:18:23.185530 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.185450 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5bf035ab-c3f4-4ce5-bd46-9b55d72879b7-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ndgrn\" (UID: \"5bf035ab-c3f4-4ce5-bd46-9b55d72879b7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ndgrn"
Apr 17 10:18:23.185694 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.185650 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0970ce61-ac90-4b77-8b8b-3e0f732cce33-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-pwt5z\" (UID: \"0970ce61-ac90-4b77-8b8b-3e0f732cce33\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pwt5z"
Apr 17 10:18:23.186789 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.186767 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/386a64c0-8bc9-4f79-b5ac-b26c8b1f4603-serving-cert\") pod \"console-operator-9d4b6777b-nskql\" (UID: \"386a64c0-8bc9-4f79-b5ac-b26c8b1f4603\") " pod="openshift-console-operator/console-operator-9d4b6777b-nskql"
Apr 17 10:18:23.186994 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.186976 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/210b0ad6-9a05-4f06-b86a-9dcf876a3ae5-stats-auth\") pod \"router-default-75b9fd94fb-6dvbt\" (UID: \"210b0ad6-9a05-4f06-b86a-9dcf876a3ae5\") " pod="openshift-ingress/router-default-75b9fd94fb-6dvbt"
Apr 17 10:18:23.198971 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.198943 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2rfq\" (UniqueName: \"kubernetes.io/projected/0970ce61-ac90-4b77-8b8b-3e0f732cce33-kube-api-access-h2rfq\") pod \"kube-storage-version-migrator-operator-6769c5d45-pwt5z\" (UID: \"0970ce61-ac90-4b77-8b8b-3e0f732cce33\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pwt5z"
Apr 17 10:18:23.199155 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.199087 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjz6c\" (UniqueName: \"kubernetes.io/projected/210b0ad6-9a05-4f06-b86a-9dcf876a3ae5-kube-api-access-cjz6c\") pod \"router-default-75b9fd94fb-6dvbt\" (UID: \"210b0ad6-9a05-4f06-b86a-9dcf876a3ae5\") " pod="openshift-ingress/router-default-75b9fd94fb-6dvbt"
Apr 17 10:18:23.199775 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.199749 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9plcc\" (UniqueName: \"kubernetes.io/projected/68dbc54c-11d7-4146-ac64-235b136dbc33-kube-api-access-9plcc\") pod \"insights-operator-585dfdc468-5c64q\" (UID: \"68dbc54c-11d7-4146-ac64-235b136dbc33\") " pod="openshift-insights/insights-operator-585dfdc468-5c64q"
Apr 17 10:18:23.199904 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.199850 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4thj\" (UniqueName: \"kubernetes.io/projected/386a64c0-8bc9-4f79-b5ac-b26c8b1f4603-kube-api-access-g4thj\") pod \"console-operator-9d4b6777b-nskql\" (UID: \"386a64c0-8bc9-4f79-b5ac-b26c8b1f4603\") " pod="openshift-console-operator/console-operator-9d4b6777b-nskql"
Apr 17 10:18:23.201400 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.201379 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jd59\" (UniqueName: \"kubernetes.io/projected/5bf035ab-c3f4-4ce5-bd46-9b55d72879b7-kube-api-access-8jd59\") pod \"cluster-monitoring-operator-75587bd455-ndgrn\" (UID: \"5bf035ab-c3f4-4ce5-bd46-9b55d72879b7\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ndgrn"
Apr 17 10:18:23.260003 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.259971 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ssrzt"
Apr 17 10:18:23.275967 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.275883 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-5c64q"
Apr 17 10:18:23.293018 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.292985 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pwt5z"
Apr 17 10:18:23.303913 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.303883 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-h4frw"]
Apr 17 10:18:23.306442 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.306379 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-nskql"
Apr 17 10:18:23.317960 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.317926 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-75b9fd94fb-6dvbt"
Apr 17 10:18:23.359593 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.354919 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ndgrn"
Apr 17 10:18:23.466240 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.466203 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ssrzt"]
Apr 17 10:18:23.470305 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.470255 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-5c64q"]
Apr 17 10:18:23.540381 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.540245 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-nskql"]
Apr 17 10:18:23.542725 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:18:23.542692 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod386a64c0_8bc9_4f79_b5ac_b26c8b1f4603.slice/crio-fc4c0c97f3a330572e9d5abb65a5811fb29786f9e843c66af7d23d8c35daf67c WatchSource:0}: Error finding container fc4c0c97f3a330572e9d5abb65a5811fb29786f9e843c66af7d23d8c35daf67c: Status 404 returned error can't find the container with id fc4c0c97f3a330572e9d5abb65a5811fb29786f9e843c66af7d23d8c35daf67c
Apr 17 10:18:23.559005 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.558797 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-75b9fd94fb-6dvbt"]
Apr 17 10:18:23.561013 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:18:23.560983 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod210b0ad6_9a05_4f06_b86a_9dcf876a3ae5.slice/crio-220b53fb0a35522eda18e5105391d767f65ef0cdf3e29af197d29b73929eea32 WatchSource:0}: Error finding container 220b53fb0a35522eda18e5105391d767f65ef0cdf3e29af197d29b73929eea32: Status 404 returned error can't find the container with id 220b53fb0a35522eda18e5105391d767f65ef0cdf3e29af197d29b73929eea32
Apr 17 10:18:23.573691 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.573665 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-ndgrn"]
Apr 17 10:18:23.583346 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:18:23.583268 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bf035ab_c3f4_4ce5_bd46_9b55d72879b7.slice/crio-d2913d94a9aeb9bfdb824165edb8936804ec6c819085ee0d4348fa9fb6ad30ce WatchSource:0}: Error finding container d2913d94a9aeb9bfdb824165edb8936804ec6c819085ee0d4348fa9fb6ad30ce: Status 404 returned error can't find the container with id d2913d94a9aeb9bfdb824165edb8936804ec6c819085ee0d4348fa9fb6ad30ce
Apr 17 10:18:23.715479 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:23.715401 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pwt5z"]
Apr 17 10:18:23.726021 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:18:23.725986 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0970ce61_ac90_4b77_8b8b_3e0f732cce33.slice/crio-8ac17bf8dcd64483754e06fb6064e326fb32500b9912f4d580d6249d5a32fd9b WatchSource:0}: Error finding container 8ac17bf8dcd64483754e06fb6064e326fb32500b9912f4d580d6249d5a32fd9b: Status 404 returned error can't find the container with id 8ac17bf8dcd64483754e06fb6064e326fb32500b9912f4d580d6249d5a32fd9b
Apr 17 10:18:24.116836 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:24.116666 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-75b9fd94fb-6dvbt" event={"ID":"210b0ad6-9a05-4f06-b86a-9dcf876a3ae5","Type":"ContainerStarted","Data":"38f73870baebfac155786416a86c3a36fde24ce90c3df167f0d2a8df79829f2b"}
Apr 17 10:18:24.116836 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:24.116803 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-75b9fd94fb-6dvbt" event={"ID":"210b0ad6-9a05-4f06-b86a-9dcf876a3ae5","Type":"ContainerStarted","Data":"220b53fb0a35522eda18e5105391d767f65ef0cdf3e29af197d29b73929eea32"}
Apr 17 10:18:24.118978 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:24.118903 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-5c64q" event={"ID":"68dbc54c-11d7-4146-ac64-235b136dbc33","Type":"ContainerStarted","Data":"bf19aa9bb35d47cf13b4d54d71c6e39d433f5915d2b018642cb2a0f0bfc19d24"}
Apr 17 10:18:24.120963 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:24.120922 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ssrzt" event={"ID":"60bcfb22-9d7f-4ed5-894f-3be5795bb21e","Type":"ContainerStarted","Data":"6c7f9691c1c95f259d7f4e086a9820ce466afd648b8d56e41eca80b94ab1eba3"}
Apr 17 10:18:24.122583 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:24.122552 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pwt5z" event={"ID":"0970ce61-ac90-4b77-8b8b-3e0f732cce33","Type":"ContainerStarted","Data":"8ac17bf8dcd64483754e06fb6064e326fb32500b9912f4d580d6249d5a32fd9b"}
Apr 17 10:18:24.124663 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:24.124634 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ndgrn" event={"ID":"5bf035ab-c3f4-4ce5-bd46-9b55d72879b7","Type":"ContainerStarted","Data":"d2913d94a9aeb9bfdb824165edb8936804ec6c819085ee0d4348fa9fb6ad30ce"}
Apr 17 10:18:24.126223 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:24.126177 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-h4frw" event={"ID":"13e5ced9-25b8-4b3f-9a0d-ac11761a6140","Type":"ContainerStarted","Data":"c26bce1057466c24d6dc31096cb3ed1c2166a58bd42bda6fb31ba527ca044dc3"}
Apr 17 10:18:24.128014 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:24.127978 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-nskql" event={"ID":"386a64c0-8bc9-4f79-b5ac-b26c8b1f4603","Type":"ContainerStarted","Data":"fc4c0c97f3a330572e9d5abb65a5811fb29786f9e843c66af7d23d8c35daf67c"}
Apr 17 10:18:24.145925 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:24.145367 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-75b9fd94fb-6dvbt" podStartSLOduration=2.145346445 podStartE2EDuration="2.145346445s" podCreationTimestamp="2026-04-17 10:18:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 10:18:24.144053728 +0000 UTC m=+63.880240082" watchObservedRunningTime="2026-04-17 10:18:24.145346445 +0000 UTC m=+63.881532782"
Apr 17 10:18:24.319503 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:24.319445 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-75b9fd94fb-6dvbt"
Apr 17 10:18:24.323096 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:24.322866 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-75b9fd94fb-6dvbt"
Apr 17 10:18:24.492108 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:24.491356 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ccb66962-e736-41cf-aff2-701b47e39579-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nn582\" (UID: \"ccb66962-e736-41cf-aff2-701b47e39579\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nn582"
Apr 17 10:18:24.513210 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:24.513178 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ccb66962-e736-41cf-aff2-701b47e39579-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nn582\" (UID: \"ccb66962-e736-41cf-aff2-701b47e39579\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nn582"
Apr 17 10:18:24.756015 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:24.755626 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-tplzz\""
Apr 17 10:18:24.764843 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:24.764323 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-nn582"
Apr 17 10:18:25.135823 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:25.135779 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-75b9fd94fb-6dvbt"
Apr 17 10:18:25.138082 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:25.137855 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-75b9fd94fb-6dvbt"
Apr 17 10:18:26.612269 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:26.612222 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs\") pod \"network-metrics-daemon-6q4fn\" (UID: \"964ce349-102e-4852-b1d8-a4c58565a4a0\") " pod="openshift-multus/network-metrics-daemon-6q4fn"
Apr 17 10:18:26.614636 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:26.614612 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/964ce349-102e-4852-b1d8-a4c58565a4a0-metrics-certs\") pod \"network-metrics-daemon-6q4fn\" (UID: \"964ce349-102e-4852-b1d8-a4c58565a4a0\") " pod="openshift-multus/network-metrics-daemon-6q4fn"
Apr 17 10:18:26.883304 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:26.883220 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-k4g8r\""
Apr 17 10:18:26.891295 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:26.891268 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6q4fn"
Apr 17 10:18:28.024562 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:28.024534 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xzjpm"
Apr 17 10:18:28.530297 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:28.530222 2579 patch_prober.go:28] interesting pod/image-registry-566b6b7754-2cpbf container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 10:18:28.530457 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:28.530282 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" podUID="fff2f824-dc09-4d90-817f-04bff462678a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 10:18:28.577696 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:28.577665 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6q4fn"]
Apr 17 10:18:28.590468 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:18:28.590440 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod964ce349_102e_4852_b1d8_a4c58565a4a0.slice/crio-8f3714c1c02d8edb506feb3a2d6a30872fb1a9012267b33dea1c2a713da2880b WatchSource:0}: Error finding container 8f3714c1c02d8edb506feb3a2d6a30872fb1a9012267b33dea1c2a713da2880b: Status 404 returned error can't find the container with id 8f3714c1c02d8edb506feb3a2d6a30872fb1a9012267b33dea1c2a713da2880b
Apr 17 10:18:28.597804 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:28.597694 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-nn582"]
Apr 17 10:18:29.148851 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:29.148810 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-5c64q" event={"ID":"68dbc54c-11d7-4146-ac64-235b136dbc33","Type":"ContainerStarted","Data":"5db3396b839883b9094a5b21eefa2ebc94567613f621a7147eb2863d52780ae2"}
Apr 17 10:18:29.155996 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:29.155944 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ssrzt" event={"ID":"60bcfb22-9d7f-4ed5-894f-3be5795bb21e","Type":"ContainerStarted","Data":"bc692355ba5a43df79f5c1d09358617ce217e529c5593847d7fa5b55eddcb9c2"}
Apr 17 10:18:29.157907 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:29.157878 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pwt5z" event={"ID":"0970ce61-ac90-4b77-8b8b-3e0f732cce33","Type":"ContainerStarted","Data":"bc4f146630bd3588ddb0117392e62e7e1bc9400d35fc2a04ce2431bd337f240e"}
Apr 17 10:18:29.159536 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:29.159507 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6q4fn" event={"ID":"964ce349-102e-4852-b1d8-a4c58565a4a0","Type":"ContainerStarted","Data":"8f3714c1c02d8edb506feb3a2d6a30872fb1a9012267b33dea1c2a713da2880b"}
Apr 17 10:18:29.161027 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:29.160983 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ndgrn" event={"ID":"5bf035ab-c3f4-4ce5-bd46-9b55d72879b7","Type":"ContainerStarted","Data":"220cb5489d06c19cca8af346bd38d031a2f5597d2a52ec6d56cf165a5fa0dd71"}
Apr 17 10:18:29.163467 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:29.162908 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-h4frw" event={"ID":"13e5ced9-25b8-4b3f-9a0d-ac11761a6140","Type":"ContainerStarted","Data":"c37583c0ad7a29d79eee280ddbf1cec11ae596b94aa922524957041a1b997bbf"}
Apr 17 10:18:29.163467 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:29.162941 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-h4frw" event={"ID":"13e5ced9-25b8-4b3f-9a0d-ac11761a6140","Type":"ContainerStarted","Data":"eeea2023a98ae43301200a29be66b5fafff61894f108c18c19d6303f06155b87"}
Apr 17 10:18:29.165926 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:29.164635 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-nn582" event={"ID":"ccb66962-e736-41cf-aff2-701b47e39579","Type":"ContainerStarted","Data":"1de60f892e35953e8abc7afc3849693c2ee20cf50a36c26d8fc1f8c1b6efa359"}
Apr 17 10:18:29.166736 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:29.166665 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-nskql" event={"ID":"386a64c0-8bc9-4f79-b5ac-b26c8b1f4603","Type":"ContainerStarted","Data":"188dd260d4ef7740745b3b794785eaabd748a84f5302f8ad5dd2cdc712d35ea1"}
Apr 17 10:18:29.169181 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:29.167382 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-nskql"
Apr 17 10:18:29.170142 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:29.169727 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-5c64q" podStartSLOduration=2.233495982 podStartE2EDuration="7.16971165s" podCreationTimestamp="2026-04-17 10:18:22 +0000 UTC" firstStartedPulling="2026-04-17 10:18:23.483565615 +0000 UTC m=+63.219751950" lastFinishedPulling="2026-04-17 10:18:28.419781271 +0000 UTC m=+68.155967618" observedRunningTime="2026-04-17 10:18:29.165156412 +0000 UTC m=+68.901342780" watchObservedRunningTime="2026-04-17 10:18:29.16971165 +0000 UTC m=+68.905898005"
Apr 17 10:18:29.213818 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:29.213741 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ndgrn" podStartSLOduration=1.378881417 podStartE2EDuration="6.213718916s" podCreationTimestamp="2026-04-17 10:18:23 +0000 UTC" firstStartedPulling="2026-04-17 10:18:23.586752317 +0000 UTC m=+63.322938648" lastFinishedPulling="2026-04-17 10:18:28.421589801 +0000 UTC m=+68.157776147" observedRunningTime="2026-04-17 10:18:29.182553164 +0000 UTC m=+68.918739517" watchObservedRunningTime="2026-04-17 10:18:29.213718916 +0000 UTC m=+68.949905331"
Apr 17 10:18:29.242085 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:29.241871 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-ssrzt" podStartSLOduration=2.313979852 podStartE2EDuration="7.241853017s" podCreationTimestamp="2026-04-17 10:18:22 +0000 UTC" firstStartedPulling="2026-04-17 10:18:23.4835645 +0000 UTC m=+63.219750836" lastFinishedPulling="2026-04-17 10:18:28.411437656 +0000 UTC m=+68.147624001" observedRunningTime="2026-04-17 10:18:29.24043811 +0000 UTC m=+68.976624459" watchObservedRunningTime="2026-04-17 10:18:29.241853017 +0000 UTC m=+68.978039367"
Apr 17 10:18:29.242899 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:29.242862 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-h4frw" podStartSLOduration=2.240374284 podStartE2EDuration="7.242851508s" podCreationTimestamp="2026-04-17 10:18:22 +0000 UTC" firstStartedPulling="2026-04-17 10:18:23.417124705 +0000 UTC m=+63.153311055" lastFinishedPulling="2026-04-17 10:18:28.419601947 +0000 UTC m=+68.155788279" observedRunningTime="2026-04-17 10:18:29.214330397 +0000 UTC m=+68.950516759" watchObservedRunningTime="2026-04-17 10:18:29.242851508 +0000 UTC m=+68.979037862"
Apr 17 10:18:29.261272 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:29.261193 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-pwt5z" podStartSLOduration=2.563590154 podStartE2EDuration="7.26117382s" podCreationTimestamp="2026-04-17 10:18:22 +0000 UTC" firstStartedPulling="2026-04-17 10:18:23.727885316 +0000 UTC m=+63.464071650" lastFinishedPulling="2026-04-17 10:18:28.425468971 +0000 UTC m=+68.161655316" observedRunningTime="2026-04-17 10:18:29.258200444 +0000 UTC m=+68.994386797" watchObservedRunningTime="2026-04-17 10:18:29.26117382 +0000 UTC m=+68.997360173"
Apr 17 10:18:29.279492 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:29.279425 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-nskql" podStartSLOduration=2.405578147 podStartE2EDuration="7.279379655s" podCreationTimestamp="2026-04-17 10:18:22 +0000 UTC" firstStartedPulling="2026-04-17 10:18:23.545455495 +0000 UTC m=+63.281641825" lastFinishedPulling="2026-04-17 10:18:28.419256986 +0000 UTC m=+68.155443333" observedRunningTime="2026-04-17 10:18:29.276294703 +0000 UTC m=+69.012481059" watchObservedRunningTime="2026-04-17 10:18:29.279379655 +0000 UTC m=+69.015566009"
Apr 17 10:18:29.918741 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:29.918689 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-nskql"
Apr 17 10:18:30.078508 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:30.078098 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-566b6b7754-2cpbf"
Apr 17 10:18:30.197486 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:30.197401 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-27dzv_156b0fef-9a3a-4de8-81ea-13ac2cb45c19/dns/0.log"
Apr 17 10:18:30.381649 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:30.381619 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-27dzv_156b0fef-9a3a-4de8-81ea-13ac2cb45c19/kube-rbac-proxy/0.log"
Apr 17 10:18:30.975834 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:30.975808 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6lwvd_730f5024-a732-4fc5-b09b-dab4adee9607/dns-node-resolver/0.log"
Apr 17 10:18:31.175099 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:31.175056 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6q4fn" event={"ID":"964ce349-102e-4852-b1d8-a4c58565a4a0","Type":"ContainerStarted","Data":"573a23ca2c9a683ad55263fb8f32aa381abaa26eabe2708e32a3b2ab552cbeff"}
Apr 17 10:18:31.175099 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:31.175097 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6q4fn" event={"ID":"964ce349-102e-4852-b1d8-a4c58565a4a0","Type":"ContainerStarted","Data":"f732e2d0131f70880f14854ecf854b82994eb94f4a6205c274e4cc1e0c7af164"}
Apr 17 10:18:31.176677 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:31.176648 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-nn582" event={"ID":"ccb66962-e736-41cf-aff2-701b47e39579","Type":"ContainerStarted","Data":"e418b6be33a1a262a671fa10b2b7e5accf71053c556355c6cdc12d6b74f195a0"}
Apr 17 10:18:31.190585 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:31.190536 2579 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-6q4fn" podStartSLOduration=69.538892506 podStartE2EDuration="1m11.190520832s" podCreationTimestamp="2026-04-17 10:17:20 +0000 UTC" firstStartedPulling="2026-04-17 10:18:28.596564566 +0000 UTC m=+68.332750915" lastFinishedPulling="2026-04-17 10:18:30.248192902 +0000 UTC m=+69.984379241" observedRunningTime="2026-04-17 10:18:31.190318485 +0000 UTC m=+70.926504838" watchObservedRunningTime="2026-04-17 10:18:31.190520832 +0000 UTC m=+70.926707185" Apr 17 10:18:31.205006 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:31.204952 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-nn582" podStartSLOduration=41.574615752 podStartE2EDuration="43.204937875s" podCreationTimestamp="2026-04-17 10:17:48 +0000 UTC" firstStartedPulling="2026-04-17 10:18:28.615428065 +0000 UTC m=+68.351614412" lastFinishedPulling="2026-04-17 10:18:30.245750191 +0000 UTC m=+69.981936535" observedRunningTime="2026-04-17 10:18:31.203673342 +0000 UTC m=+70.939859695" watchObservedRunningTime="2026-04-17 10:18:31.204937875 +0000 UTC m=+70.941124228" Apr 17 10:18:31.576542 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:31.576510 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-566b6b7754-2cpbf_fff2f824-dc09-4d90-817f-04bff462678a/registry/0.log" Apr 17 10:18:32.176121 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:32.176089 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qlxxc_e8471c84-04de-44cd-a299-b4dc3d7003ab/node-ca/0.log" Apr 17 10:18:32.576271 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:32.576240 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-75b9fd94fb-6dvbt_210b0ad6-9a05-4f06-b86a-9dcf876a3ae5/router/0.log" Apr 17 10:18:32.776187 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:32.776158 2579 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-gbbkk_50cdff83-6754-4d62-8a0c-bfe31b13a7eb/serve-healthcheck-canary/0.log" Apr 17 10:18:37.678450 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:37.678404 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-4kssn"] Apr 17 10:18:37.682948 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:37.682927 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4kssn" Apr 17 10:18:37.685404 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:37.685380 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 10:18:37.685538 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:37.685490 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-7l8hg\"" Apr 17 10:18:37.685707 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:37.685685 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 10:18:37.693458 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:37.693432 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4kssn"] Apr 17 10:18:37.804246 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:37.804210 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b7940a9f-1624-49aa-8779-367a5cc2092e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4kssn\" (UID: \"b7940a9f-1624-49aa-8779-367a5cc2092e\") " pod="openshift-insights/insights-runtime-extractor-4kssn" Apr 17 10:18:37.804246 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:37.804248 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b7940a9f-1624-49aa-8779-367a5cc2092e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4kssn\" (UID: \"b7940a9f-1624-49aa-8779-367a5cc2092e\") " pod="openshift-insights/insights-runtime-extractor-4kssn" Apr 17 10:18:37.804551 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:37.804270 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8mmk\" (UniqueName: \"kubernetes.io/projected/b7940a9f-1624-49aa-8779-367a5cc2092e-kube-api-access-l8mmk\") pod \"insights-runtime-extractor-4kssn\" (UID: \"b7940a9f-1624-49aa-8779-367a5cc2092e\") " pod="openshift-insights/insights-runtime-extractor-4kssn" Apr 17 10:18:37.804551 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:37.804334 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b7940a9f-1624-49aa-8779-367a5cc2092e-data-volume\") pod \"insights-runtime-extractor-4kssn\" (UID: \"b7940a9f-1624-49aa-8779-367a5cc2092e\") " pod="openshift-insights/insights-runtime-extractor-4kssn" Apr 17 10:18:37.804551 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:37.804488 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b7940a9f-1624-49aa-8779-367a5cc2092e-crio-socket\") pod \"insights-runtime-extractor-4kssn\" (UID: \"b7940a9f-1624-49aa-8779-367a5cc2092e\") " pod="openshift-insights/insights-runtime-extractor-4kssn" Apr 17 10:18:37.905616 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:37.905570 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/b7940a9f-1624-49aa-8779-367a5cc2092e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4kssn\" (UID: \"b7940a9f-1624-49aa-8779-367a5cc2092e\") " pod="openshift-insights/insights-runtime-extractor-4kssn" Apr 17 10:18:37.905765 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:37.905620 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b7940a9f-1624-49aa-8779-367a5cc2092e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4kssn\" (UID: \"b7940a9f-1624-49aa-8779-367a5cc2092e\") " pod="openshift-insights/insights-runtime-extractor-4kssn" Apr 17 10:18:37.905855 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:37.905829 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8mmk\" (UniqueName: \"kubernetes.io/projected/b7940a9f-1624-49aa-8779-367a5cc2092e-kube-api-access-l8mmk\") pod \"insights-runtime-extractor-4kssn\" (UID: \"b7940a9f-1624-49aa-8779-367a5cc2092e\") " pod="openshift-insights/insights-runtime-extractor-4kssn" Apr 17 10:18:37.905993 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:37.905967 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b7940a9f-1624-49aa-8779-367a5cc2092e-data-volume\") pod \"insights-runtime-extractor-4kssn\" (UID: \"b7940a9f-1624-49aa-8779-367a5cc2092e\") " pod="openshift-insights/insights-runtime-extractor-4kssn" Apr 17 10:18:37.906053 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:37.906025 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b7940a9f-1624-49aa-8779-367a5cc2092e-crio-socket\") pod \"insights-runtime-extractor-4kssn\" (UID: \"b7940a9f-1624-49aa-8779-367a5cc2092e\") " pod="openshift-insights/insights-runtime-extractor-4kssn" Apr 17 10:18:37.906167 ip-10-0-128-180 
kubenswrapper[2579]: I0417 10:18:37.906149 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b7940a9f-1624-49aa-8779-367a5cc2092e-crio-socket\") pod \"insights-runtime-extractor-4kssn\" (UID: \"b7940a9f-1624-49aa-8779-367a5cc2092e\") " pod="openshift-insights/insights-runtime-extractor-4kssn" Apr 17 10:18:37.906167 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:37.906159 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b7940a9f-1624-49aa-8779-367a5cc2092e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4kssn\" (UID: \"b7940a9f-1624-49aa-8779-367a5cc2092e\") " pod="openshift-insights/insights-runtime-extractor-4kssn" Apr 17 10:18:37.906254 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:37.906216 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b7940a9f-1624-49aa-8779-367a5cc2092e-data-volume\") pod \"insights-runtime-extractor-4kssn\" (UID: \"b7940a9f-1624-49aa-8779-367a5cc2092e\") " pod="openshift-insights/insights-runtime-extractor-4kssn" Apr 17 10:18:37.908486 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:37.908459 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b7940a9f-1624-49aa-8779-367a5cc2092e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4kssn\" (UID: \"b7940a9f-1624-49aa-8779-367a5cc2092e\") " pod="openshift-insights/insights-runtime-extractor-4kssn" Apr 17 10:18:37.914189 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:37.914164 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8mmk\" (UniqueName: \"kubernetes.io/projected/b7940a9f-1624-49aa-8779-367a5cc2092e-kube-api-access-l8mmk\") pod \"insights-runtime-extractor-4kssn\" (UID: 
\"b7940a9f-1624-49aa-8779-367a5cc2092e\") " pod="openshift-insights/insights-runtime-extractor-4kssn" Apr 17 10:18:37.993465 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:37.993360 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4kssn" Apr 17 10:18:38.118288 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:38.118256 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4kssn"] Apr 17 10:18:38.121135 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:18:38.121106 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7940a9f_1624_49aa_8779_367a5cc2092e.slice/crio-84ef6add4abce0f304695e2e4bec690d088d72b31918c81a31593ca6842c01a7 WatchSource:0}: Error finding container 84ef6add4abce0f304695e2e4bec690d088d72b31918c81a31593ca6842c01a7: Status 404 returned error can't find the container with id 84ef6add4abce0f304695e2e4bec690d088d72b31918c81a31593ca6842c01a7 Apr 17 10:18:38.198841 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:38.198787 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4kssn" event={"ID":"b7940a9f-1624-49aa-8779-367a5cc2092e","Type":"ContainerStarted","Data":"42959d9f1f9e57bc21a2fb24b0f512821011973760c8c28eadf452d137992358"} Apr 17 10:18:38.198841 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:38.198833 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4kssn" event={"ID":"b7940a9f-1624-49aa-8779-367a5cc2092e","Type":"ContainerStarted","Data":"84ef6add4abce0f304695e2e4bec690d088d72b31918c81a31593ca6842c01a7"} Apr 17 10:18:39.203332 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:39.203292 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4kssn" 
event={"ID":"b7940a9f-1624-49aa-8779-367a5cc2092e","Type":"ContainerStarted","Data":"a77db96e562964d111c352d8ee71823d912db0e4ee93c2e0bff65d16d4bc09b6"} Apr 17 10:18:40.207659 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:40.207611 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4kssn" event={"ID":"b7940a9f-1624-49aa-8779-367a5cc2092e","Type":"ContainerStarted","Data":"42abeb64a07773485c9f4ab665d879f70f083364e9795ed1c7e1c056681c16e9"} Apr 17 10:18:40.225858 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:40.225790 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-4kssn" podStartSLOduration=1.313214166 podStartE2EDuration="3.225769435s" podCreationTimestamp="2026-04-17 10:18:37 +0000 UTC" firstStartedPulling="2026-04-17 10:18:38.184245324 +0000 UTC m=+77.920431655" lastFinishedPulling="2026-04-17 10:18:40.096800593 +0000 UTC m=+79.832986924" observedRunningTime="2026-04-17 10:18:40.225482815 +0000 UTC m=+79.961669171" watchObservedRunningTime="2026-04-17 10:18:40.225769435 +0000 UTC m=+79.961955789" Apr 17 10:18:43.524621 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.524586 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-j2j56"] Apr 17 10:18:43.530285 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.530259 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.533198 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.533173 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 10:18:43.533848 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.533826 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 10:18:43.533997 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.533913 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-qk24s\"" Apr 17 10:18:43.534077 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.533931 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 10:18:43.534143 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.533951 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 10:18:43.660103 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.660061 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-node-exporter-textfile\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.660103 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.660108 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgs7d\" (UniqueName: \"kubernetes.io/projected/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-kube-api-access-xgs7d\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " 
pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.660474 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.660137 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-node-exporter-accelerators-collector-config\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.660474 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.660168 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-node-exporter-wtmp\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.660474 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.660190 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-metrics-client-ca\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.660474 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.660280 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-root\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.660474 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.660332 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-node-exporter-tls\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.660474 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.660384 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.660474 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.660429 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-sys\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.761494 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.761452 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-node-exporter-textfile\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.761705 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.761512 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgs7d\" (UniqueName: \"kubernetes.io/projected/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-kube-api-access-xgs7d\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.761705 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.761547 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-node-exporter-accelerators-collector-config\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.761705 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.761577 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-node-exporter-wtmp\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.761705 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.761604 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-metrics-client-ca\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.761705 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.761652 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-root\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.761705 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.761682 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-node-exporter-tls\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.762007 
ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.761736 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.762007 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.761769 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-sys\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.762007 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.761837 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-node-exporter-textfile\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.762007 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.761866 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-sys\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.762190 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.762030 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-root\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.762190 ip-10-0-128-180 
kubenswrapper[2579]: I0417 10:18:43.762101 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-node-exporter-wtmp\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.762431 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.762383 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-metrics-client-ca\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.762742 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.762716 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-node-exporter-accelerators-collector-config\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.764250 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.764229 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.764364 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.764344 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-node-exporter-tls\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " 
pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.769989 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.769962 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgs7d\" (UniqueName: \"kubernetes.io/projected/ebe2c4a6-9f94-45e4-9d13-c92fb307797f-kube-api-access-xgs7d\") pod \"node-exporter-j2j56\" (UID: \"ebe2c4a6-9f94-45e4-9d13-c92fb307797f\") " pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.841606 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:43.841521 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-j2j56" Apr 17 10:18:43.852379 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:18:43.852341 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebe2c4a6_9f94_45e4_9d13_c92fb307797f.slice/crio-2b0c86342a1bdde09b3d97f013ce456c8f3d5a58cb363189d818b929dae62cbb WatchSource:0}: Error finding container 2b0c86342a1bdde09b3d97f013ce456c8f3d5a58cb363189d818b929dae62cbb: Status 404 returned error can't find the container with id 2b0c86342a1bdde09b3d97f013ce456c8f3d5a58cb363189d818b929dae62cbb Apr 17 10:18:44.220677 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:44.220583 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j2j56" event={"ID":"ebe2c4a6-9f94-45e4-9d13-c92fb307797f","Type":"ContainerStarted","Data":"2b0c86342a1bdde09b3d97f013ce456c8f3d5a58cb363189d818b929dae62cbb"} Apr 17 10:18:45.224519 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.224482 2579 generic.go:358] "Generic (PLEG): container finished" podID="ebe2c4a6-9f94-45e4-9d13-c92fb307797f" containerID="936b573fd3c3c124a2bf26b59d6936769d9cc84ffec2974472cf3e4d2016bd16" exitCode=0 Apr 17 10:18:45.224909 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.224564 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-j2j56" event={"ID":"ebe2c4a6-9f94-45e4-9d13-c92fb307797f","Type":"ContainerDied","Data":"936b573fd3c3c124a2bf26b59d6936769d9cc84ffec2974472cf3e4d2016bd16"}
Apr 17 10:18:45.569880 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.569842 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"]
Apr 17 10:18:45.573572 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.573542 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.575971 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.575941 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 17 10:18:45.575971 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.575941 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 17 10:18:45.576140 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.575953 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 17 10:18:45.576140 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.576037 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 17 10:18:45.576140 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.576055 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-fh7ss\""
Apr 17 10:18:45.576301 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.576283 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 17 10:18:45.576387 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.576365 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-d5k20l85e96io\""
Apr 17 10:18:45.583533 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.583506 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"]
Apr 17 10:18:45.677560 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.677518 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4908b461-b08b-4871-9ad8-b70af0813818-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.677767 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.677575 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4908b461-b08b-4871-9ad8-b70af0813818-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.677767 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.677670 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4908b461-b08b-4871-9ad8-b70af0813818-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.677767 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.677713 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4908b461-b08b-4871-9ad8-b70af0813818-secret-grpc-tls\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.677913 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.677749 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4908b461-b08b-4871-9ad8-b70af0813818-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.677913 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.677802 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4908b461-b08b-4871-9ad8-b70af0813818-secret-thanos-querier-tls\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.677913 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.677885 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4908b461-b08b-4871-9ad8-b70af0813818-metrics-client-ca\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.678026 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.677916 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dc8h\" (UniqueName: \"kubernetes.io/projected/4908b461-b08b-4871-9ad8-b70af0813818-kube-api-access-5dc8h\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.778683 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.778636 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dc8h\" (UniqueName: \"kubernetes.io/projected/4908b461-b08b-4871-9ad8-b70af0813818-kube-api-access-5dc8h\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.778861 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.778706 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4908b461-b08b-4871-9ad8-b70af0813818-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.778861 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.778756 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4908b461-b08b-4871-9ad8-b70af0813818-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.778861 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.778794 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4908b461-b08b-4871-9ad8-b70af0813818-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.778861 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.778822 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4908b461-b08b-4871-9ad8-b70af0813818-secret-grpc-tls\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.779083 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.778948 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4908b461-b08b-4871-9ad8-b70af0813818-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.779083 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.778997 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4908b461-b08b-4871-9ad8-b70af0813818-secret-thanos-querier-tls\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.779083 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.779041 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4908b461-b08b-4871-9ad8-b70af0813818-metrics-client-ca\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.779874 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.779824 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4908b461-b08b-4871-9ad8-b70af0813818-metrics-client-ca\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.781580 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.781550 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4908b461-b08b-4871-9ad8-b70af0813818-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.781706 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.781684 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4908b461-b08b-4871-9ad8-b70af0813818-secret-thanos-querier-tls\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.781800 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.781775 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4908b461-b08b-4871-9ad8-b70af0813818-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.781885 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.781867 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4908b461-b08b-4871-9ad8-b70af0813818-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.782030 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.782015 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4908b461-b08b-4871-9ad8-b70af0813818-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.782072 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.782022 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4908b461-b08b-4871-9ad8-b70af0813818-secret-grpc-tls\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.787039 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.787014 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dc8h\" (UniqueName: \"kubernetes.io/projected/4908b461-b08b-4871-9ad8-b70af0813818-kube-api-access-5dc8h\") pod \"thanos-querier-6d6dc7bc7f-5np9w\" (UID: \"4908b461-b08b-4871-9ad8-b70af0813818\") " pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:45.884620 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:45.884518 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"
Apr 17 10:18:46.012008 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:46.011968 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w"]
Apr 17 10:18:46.016826 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:18:46.016787 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4908b461_b08b_4871_9ad8_b70af0813818.slice/crio-c8c834d7e55b9b569c2aea933c54043ed7234a27b4e9eb696f9c9c29945d00c1 WatchSource:0}: Error finding container c8c834d7e55b9b569c2aea933c54043ed7234a27b4e9eb696f9c9c29945d00c1: Status 404 returned error can't find the container with id c8c834d7e55b9b569c2aea933c54043ed7234a27b4e9eb696f9c9c29945d00c1
Apr 17 10:18:46.229372 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:46.229270 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j2j56" event={"ID":"ebe2c4a6-9f94-45e4-9d13-c92fb307797f","Type":"ContainerStarted","Data":"aa7b28ca7611a7b515ab2f22ac567fa616f55a0efec3ee6cd02971dc52fae4ab"}
Apr 17 10:18:46.229372 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:46.229312 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-j2j56" event={"ID":"ebe2c4a6-9f94-45e4-9d13-c92fb307797f","Type":"ContainerStarted","Data":"fb520d211a7e39b3c8edc4a82df5bace0c2f8091d5a381a11193042d6552208c"}
Apr 17 10:18:46.230329 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:46.230304 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w" event={"ID":"4908b461-b08b-4871-9ad8-b70af0813818","Type":"ContainerStarted","Data":"c8c834d7e55b9b569c2aea933c54043ed7234a27b4e9eb696f9c9c29945d00c1"}
Apr 17 10:18:46.247083 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:46.247033 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-j2j56" podStartSLOduration=2.430466416 podStartE2EDuration="3.247019623s" podCreationTimestamp="2026-04-17 10:18:43 +0000 UTC" firstStartedPulling="2026-04-17 10:18:43.854470466 +0000 UTC m=+83.590656796" lastFinishedPulling="2026-04-17 10:18:44.671023658 +0000 UTC m=+84.407210003" observedRunningTime="2026-04-17 10:18:46.246352421 +0000 UTC m=+85.982538774" watchObservedRunningTime="2026-04-17 10:18:46.247019623 +0000 UTC m=+85.983205975"
Apr 17 10:18:47.897930 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:47.897892 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6f79cf884b-wdz2t"]
Apr 17 10:18:47.902385 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:47.902360 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:47.904720 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:47.904697 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 17 10:18:47.905587 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:47.905564 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-x88z2\""
Apr 17 10:18:47.905701 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:47.905584 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 17 10:18:47.905701 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:47.905572 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 17 10:18:47.905701 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:47.905589 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-d95d1vf8obn9e\""
Apr 17 10:18:47.905851 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:47.905570 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 10:18:47.911564 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:47.911528 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6f79cf884b-wdz2t"]
Apr 17 10:18:47.998938 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:47.998901 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/30850014-6153-4b95-b74e-6db1f8fa25e4-secret-metrics-server-client-certs\") pod \"metrics-server-6f79cf884b-wdz2t\" (UID: \"30850014-6153-4b95-b74e-6db1f8fa25e4\") " pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:47.999097 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:47.998958 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/30850014-6153-4b95-b74e-6db1f8fa25e4-secret-metrics-server-tls\") pod \"metrics-server-6f79cf884b-wdz2t\" (UID: \"30850014-6153-4b95-b74e-6db1f8fa25e4\") " pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:47.999097 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:47.999000 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6n5j\" (UniqueName: \"kubernetes.io/projected/30850014-6153-4b95-b74e-6db1f8fa25e4-kube-api-access-d6n5j\") pod \"metrics-server-6f79cf884b-wdz2t\" (UID: \"30850014-6153-4b95-b74e-6db1f8fa25e4\") " pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:47.999097 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:47.999036 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/30850014-6153-4b95-b74e-6db1f8fa25e4-audit-log\") pod \"metrics-server-6f79cf884b-wdz2t\" (UID: \"30850014-6153-4b95-b74e-6db1f8fa25e4\") " pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:47.999097 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:47.999052 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/30850014-6153-4b95-b74e-6db1f8fa25e4-metrics-server-audit-profiles\") pod \"metrics-server-6f79cf884b-wdz2t\" (UID: \"30850014-6153-4b95-b74e-6db1f8fa25e4\") " pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:47.999097 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:47.999097 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30850014-6153-4b95-b74e-6db1f8fa25e4-client-ca-bundle\") pod \"metrics-server-6f79cf884b-wdz2t\" (UID: \"30850014-6153-4b95-b74e-6db1f8fa25e4\") " pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:47.999390 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:47.999130 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30850014-6153-4b95-b74e-6db1f8fa25e4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6f79cf884b-wdz2t\" (UID: \"30850014-6153-4b95-b74e-6db1f8fa25e4\") " pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:48.099590 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.099550 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30850014-6153-4b95-b74e-6db1f8fa25e4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6f79cf884b-wdz2t\" (UID: \"30850014-6153-4b95-b74e-6db1f8fa25e4\") " pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:48.099748 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.099614 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/30850014-6153-4b95-b74e-6db1f8fa25e4-secret-metrics-server-client-certs\") pod \"metrics-server-6f79cf884b-wdz2t\" (UID: \"30850014-6153-4b95-b74e-6db1f8fa25e4\") " pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:48.099748 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.099647 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/30850014-6153-4b95-b74e-6db1f8fa25e4-secret-metrics-server-tls\") pod \"metrics-server-6f79cf884b-wdz2t\" (UID: \"30850014-6153-4b95-b74e-6db1f8fa25e4\") " pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:48.099748 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.099666 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6n5j\" (UniqueName: \"kubernetes.io/projected/30850014-6153-4b95-b74e-6db1f8fa25e4-kube-api-access-d6n5j\") pod \"metrics-server-6f79cf884b-wdz2t\" (UID: \"30850014-6153-4b95-b74e-6db1f8fa25e4\") " pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:48.099748 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.099698 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/30850014-6153-4b95-b74e-6db1f8fa25e4-audit-log\") pod \"metrics-server-6f79cf884b-wdz2t\" (UID: \"30850014-6153-4b95-b74e-6db1f8fa25e4\") " pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:48.099748 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.099723 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/30850014-6153-4b95-b74e-6db1f8fa25e4-metrics-server-audit-profiles\") pod \"metrics-server-6f79cf884b-wdz2t\" (UID: \"30850014-6153-4b95-b74e-6db1f8fa25e4\") " pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:48.099999 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.099757 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30850014-6153-4b95-b74e-6db1f8fa25e4-client-ca-bundle\") pod \"metrics-server-6f79cf884b-wdz2t\" (UID: \"30850014-6153-4b95-b74e-6db1f8fa25e4\") " pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:48.100180 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.100118 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/30850014-6153-4b95-b74e-6db1f8fa25e4-audit-log\") pod \"metrics-server-6f79cf884b-wdz2t\" (UID: \"30850014-6153-4b95-b74e-6db1f8fa25e4\") " pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:48.100362 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.100340 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30850014-6153-4b95-b74e-6db1f8fa25e4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6f79cf884b-wdz2t\" (UID: \"30850014-6153-4b95-b74e-6db1f8fa25e4\") " pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:48.101320 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.101289 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/30850014-6153-4b95-b74e-6db1f8fa25e4-metrics-server-audit-profiles\") pod \"metrics-server-6f79cf884b-wdz2t\" (UID: \"30850014-6153-4b95-b74e-6db1f8fa25e4\") " pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:48.102237 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.102212 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/30850014-6153-4b95-b74e-6db1f8fa25e4-secret-metrics-server-tls\") pod \"metrics-server-6f79cf884b-wdz2t\" (UID: \"30850014-6153-4b95-b74e-6db1f8fa25e4\") " pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:48.102410 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.102393 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/30850014-6153-4b95-b74e-6db1f8fa25e4-secret-metrics-server-client-certs\") pod \"metrics-server-6f79cf884b-wdz2t\" (UID: \"30850014-6153-4b95-b74e-6db1f8fa25e4\") " pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:48.102507 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.102493 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30850014-6153-4b95-b74e-6db1f8fa25e4-client-ca-bundle\") pod \"metrics-server-6f79cf884b-wdz2t\" (UID: \"30850014-6153-4b95-b74e-6db1f8fa25e4\") " pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:48.108236 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.108214 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6n5j\" (UniqueName: \"kubernetes.io/projected/30850014-6153-4b95-b74e-6db1f8fa25e4-kube-api-access-d6n5j\") pod \"metrics-server-6f79cf884b-wdz2t\" (UID: \"30850014-6153-4b95-b74e-6db1f8fa25e4\") " pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:48.216402 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.216360 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t"
Apr 17 10:18:48.259230 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.259169 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w" event={"ID":"4908b461-b08b-4871-9ad8-b70af0813818","Type":"ContainerStarted","Data":"6c4110455fbe95c3a756a1c42b87aaedcbccbda3066704fae1bca694a8027869"}
Apr 17 10:18:48.370329 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.370188 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6f79cf884b-wdz2t"]
Apr 17 10:18:48.373567 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:18:48.373471 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30850014_6153_4b95_b74e_6db1f8fa25e4.slice/crio-cb0ebeb71e79696d4484780147c952db3c79cd1fa37aecf8d7508044f0e54789 WatchSource:0}: Error finding container cb0ebeb71e79696d4484780147c952db3c79cd1fa37aecf8d7508044f0e54789: Status 404 returned error can't find the container with id cb0ebeb71e79696d4484780147c952db3c79cd1fa37aecf8d7508044f0e54789
Apr 17 10:18:48.720868 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.720825 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-bdf54899-tlqjf"]
Apr 17 10:18:48.724384 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.724358 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf"
Apr 17 10:18:48.726923 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.726897 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 17 10:18:48.727258 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.727231 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 17 10:18:48.727382 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.727274 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 17 10:18:48.727382 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.727316 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 17 10:18:48.727382 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.727281 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 17 10:18:48.727382 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.727274 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-f4c7v\""
Apr 17 10:18:48.732684 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.732663 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 17 10:18:48.738922 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.738896 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-bdf54899-tlqjf"]
Apr 17 10:18:48.807888 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.807846 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8fc6b947-b1ac-4dd3-ad0d-963849a98169-secret-telemeter-client\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf"
Apr 17 10:18:48.808084 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.807908 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fc6b947-b1ac-4dd3-ad0d-963849a98169-serving-certs-ca-bundle\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf"
Apr 17 10:18:48.808084 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.808011 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8fc6b947-b1ac-4dd3-ad0d-963849a98169-telemeter-client-tls\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf"
Apr 17 10:18:48.808084 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.808056 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g75g\" (UniqueName: \"kubernetes.io/projected/8fc6b947-b1ac-4dd3-ad0d-963849a98169-kube-api-access-5g75g\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf"
Apr 17 10:18:48.808204 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.808139 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8fc6b947-b1ac-4dd3-ad0d-963849a98169-metrics-client-ca\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf"
Apr 17 10:18:48.808243 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.808203 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8fc6b947-b1ac-4dd3-ad0d-963849a98169-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf"
Apr 17 10:18:48.808275 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.808246 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fc6b947-b1ac-4dd3-ad0d-963849a98169-telemeter-trusted-ca-bundle\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf"
Apr 17 10:18:48.808325 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.808279 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8fc6b947-b1ac-4dd3-ad0d-963849a98169-federate-client-tls\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf"
Apr 17 10:18:48.909744 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.909694 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8fc6b947-b1ac-4dd3-ad0d-963849a98169-secret-telemeter-client\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf"
Apr 17 10:18:48.910250 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.909768 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fc6b947-b1ac-4dd3-ad0d-963849a98169-serving-certs-ca-bundle\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf"
Apr 17 10:18:48.910250 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.909813 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8fc6b947-b1ac-4dd3-ad0d-963849a98169-telemeter-client-tls\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf"
Apr 17 10:18:48.910250 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.909848 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5g75g\" (UniqueName: \"kubernetes.io/projected/8fc6b947-b1ac-4dd3-ad0d-963849a98169-kube-api-access-5g75g\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf"
Apr 17 10:18:48.910250 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.909883 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8fc6b947-b1ac-4dd3-ad0d-963849a98169-metrics-client-ca\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf"
Apr 17 10:18:48.910250 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.909930 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8fc6b947-b1ac-4dd3-ad0d-963849a98169-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf"
Apr 17 10:18:48.910250 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.909988 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fc6b947-b1ac-4dd3-ad0d-963849a98169-telemeter-trusted-ca-bundle\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf"
Apr 17 10:18:48.910250 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.910031 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8fc6b947-b1ac-4dd3-ad0d-963849a98169-federate-client-tls\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf"
Apr 17 10:18:48.911401 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.911343 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fc6b947-b1ac-4dd3-ad0d-963849a98169-serving-certs-ca-bundle\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf"
Apr 17 10:18:48.911711 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.911538 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8fc6b947-b1ac-4dd3-ad0d-963849a98169-metrics-client-ca\") pod \"telemeter-client-bdf54899-tlqjf\" (UID:
\"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf" Apr 17 10:18:48.911993 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.911966 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8fc6b947-b1ac-4dd3-ad0d-963849a98169-telemeter-trusted-ca-bundle\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf" Apr 17 10:18:48.913555 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.913510 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8fc6b947-b1ac-4dd3-ad0d-963849a98169-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf" Apr 17 10:18:48.913856 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.913832 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/8fc6b947-b1ac-4dd3-ad0d-963849a98169-federate-client-tls\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf" Apr 17 10:18:48.914532 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.914510 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/8fc6b947-b1ac-4dd3-ad0d-963849a98169-telemeter-client-tls\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf" Apr 17 10:18:48.914986 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.914960 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/8fc6b947-b1ac-4dd3-ad0d-963849a98169-secret-telemeter-client\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf" Apr 17 10:18:48.925441 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:48.925392 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g75g\" (UniqueName: \"kubernetes.io/projected/8fc6b947-b1ac-4dd3-ad0d-963849a98169-kube-api-access-5g75g\") pod \"telemeter-client-bdf54899-tlqjf\" (UID: \"8fc6b947-b1ac-4dd3-ad0d-963849a98169\") " pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf" Apr 17 10:18:49.034759 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:49.034692 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf" Apr 17 10:18:49.263150 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:49.263119 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t" event={"ID":"30850014-6153-4b95-b74e-6db1f8fa25e4","Type":"ContainerStarted","Data":"cb0ebeb71e79696d4484780147c952db3c79cd1fa37aecf8d7508044f0e54789"} Apr 17 10:18:49.265057 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:49.265023 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w" event={"ID":"4908b461-b08b-4871-9ad8-b70af0813818","Type":"ContainerStarted","Data":"209f93e3138284b899ff66238a68fb0f69db537dafd66b2a62fd22237da0a884"} Apr 17 10:18:49.265171 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:49.265065 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w" 
event={"ID":"4908b461-b08b-4871-9ad8-b70af0813818","Type":"ContainerStarted","Data":"1fc68b2002804433dca5f70faccb196cef2fb3d34becc601a586669691caf6c0"} Apr 17 10:18:49.329117 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:49.329078 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-bdf54899-tlqjf"] Apr 17 10:18:49.644500 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:18:49.644471 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fc6b947_b1ac_4dd3_ad0d_963849a98169.slice/crio-faf975e657c8b8a9192185411d567313fdfffd450022948153d92df58858f40e WatchSource:0}: Error finding container faf975e657c8b8a9192185411d567313fdfffd450022948153d92df58858f40e: Status 404 returned error can't find the container with id faf975e657c8b8a9192185411d567313fdfffd450022948153d92df58858f40e Apr 17 10:18:50.272748 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:50.272660 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w" event={"ID":"4908b461-b08b-4871-9ad8-b70af0813818","Type":"ContainerStarted","Data":"b99b0561fbcbe8123944603c9e5ec1e4c7f11e4e9e8df9fa3bc9978bbf111316"} Apr 17 10:18:50.272748 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:50.272703 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w" event={"ID":"4908b461-b08b-4871-9ad8-b70af0813818","Type":"ContainerStarted","Data":"65c04d0fe68d7ae0f8dd591f6596b7fa6629044c0cc59257642b9aea1c9f0cc8"} Apr 17 10:18:50.272748 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:50.272720 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w" event={"ID":"4908b461-b08b-4871-9ad8-b70af0813818","Type":"ContainerStarted","Data":"ce9c6a48e674cc8b2aee252b66fb25c1eb2944b9b93c4cf839703f60b07cac2c"} Apr 17 10:18:50.273312 ip-10-0-128-180 
kubenswrapper[2579]: I0417 10:18:50.272838 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w" Apr 17 10:18:50.273954 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:50.273926 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf" event={"ID":"8fc6b947-b1ac-4dd3-ad0d-963849a98169","Type":"ContainerStarted","Data":"faf975e657c8b8a9192185411d567313fdfffd450022948153d92df58858f40e"} Apr 17 10:18:50.275644 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:50.275611 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t" event={"ID":"30850014-6153-4b95-b74e-6db1f8fa25e4","Type":"ContainerStarted","Data":"f63d2b4bc1655022d773b71ae8005c44f691c50ee672d166682a3505ce74251d"} Apr 17 10:18:50.293269 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:50.293202 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w" podStartSLOduration=2.08519261 podStartE2EDuration="5.293185629s" podCreationTimestamp="2026-04-17 10:18:45 +0000 UTC" firstStartedPulling="2026-04-17 10:18:46.018788893 +0000 UTC m=+85.754975223" lastFinishedPulling="2026-04-17 10:18:49.226781888 +0000 UTC m=+88.962968242" observedRunningTime="2026-04-17 10:18:50.291912687 +0000 UTC m=+90.028099043" watchObservedRunningTime="2026-04-17 10:18:50.293185629 +0000 UTC m=+90.029371983" Apr 17 10:18:50.308713 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:50.308659 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t" podStartSLOduration=1.992299206 podStartE2EDuration="3.308646042s" podCreationTimestamp="2026-04-17 10:18:47 +0000 UTC" firstStartedPulling="2026-04-17 10:18:48.37575086 +0000 UTC m=+88.111937191" lastFinishedPulling="2026-04-17 10:18:49.692097692 +0000 UTC 
m=+89.428284027" observedRunningTime="2026-04-17 10:18:50.307315742 +0000 UTC m=+90.043502106" watchObservedRunningTime="2026-04-17 10:18:50.308646042 +0000 UTC m=+90.044832392" Apr 17 10:18:51.057151 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.057111 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5fdc7f988b-2867z"] Apr 17 10:18:51.060732 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.060708 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.063223 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.063195 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-4b9bn\"" Apr 17 10:18:51.063345 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.063228 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 10:18:51.063345 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.063250 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 10:18:51.063345 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.063274 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 10:18:51.063530 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.063502 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 10:18:51.063623 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.063605 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 10:18:51.063679 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.063626 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 10:18:51.063679 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.063655 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 10:18:51.068369 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.068349 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 10:18:51.071951 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.071930 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fdc7f988b-2867z"] Apr 17 10:18:51.232792 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.232755 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/45889600-158b-4b62-912f-d6d90cd827d4-console-serving-cert\") pod \"console-5fdc7f988b-2867z\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.232792 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.232791 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-oauth-serving-cert\") pod \"console-5fdc7f988b-2867z\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.232998 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.232862 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g5wr\" (UniqueName: \"kubernetes.io/projected/45889600-158b-4b62-912f-d6d90cd827d4-kube-api-access-8g5wr\") pod \"console-5fdc7f988b-2867z\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.232998 
ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.232922 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/45889600-158b-4b62-912f-d6d90cd827d4-console-oauth-config\") pod \"console-5fdc7f988b-2867z\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.232998 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.232948 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-service-ca\") pod \"console-5fdc7f988b-2867z\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.232998 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.232967 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-console-config\") pod \"console-5fdc7f988b-2867z\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.233121 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.233042 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-trusted-ca-bundle\") pod \"console-5fdc7f988b-2867z\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.334815 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.334692 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8g5wr\" (UniqueName: \"kubernetes.io/projected/45889600-158b-4b62-912f-d6d90cd827d4-kube-api-access-8g5wr\") pod 
\"console-5fdc7f988b-2867z\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.335273 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.334852 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/45889600-158b-4b62-912f-d6d90cd827d4-console-oauth-config\") pod \"console-5fdc7f988b-2867z\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.335273 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.334891 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-service-ca\") pod \"console-5fdc7f988b-2867z\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.335273 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.334918 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-console-config\") pod \"console-5fdc7f988b-2867z\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.335273 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.334988 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-trusted-ca-bundle\") pod \"console-5fdc7f988b-2867z\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.335273 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.335063 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/45889600-158b-4b62-912f-d6d90cd827d4-console-serving-cert\") pod \"console-5fdc7f988b-2867z\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.335273 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.335089 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-oauth-serving-cert\") pod \"console-5fdc7f988b-2867z\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.336075 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.336046 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-service-ca\") pod \"console-5fdc7f988b-2867z\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.336191 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.336046 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-console-config\") pod \"console-5fdc7f988b-2867z\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.336191 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.336176 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-oauth-serving-cert\") pod \"console-5fdc7f988b-2867z\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.336309 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.336243 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-trusted-ca-bundle\") pod \"console-5fdc7f988b-2867z\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.338296 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.338273 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/45889600-158b-4b62-912f-d6d90cd827d4-console-serving-cert\") pod \"console-5fdc7f988b-2867z\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.338433 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.338280 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/45889600-158b-4b62-912f-d6d90cd827d4-console-oauth-config\") pod \"console-5fdc7f988b-2867z\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.343605 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.343583 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g5wr\" (UniqueName: \"kubernetes.io/projected/45889600-158b-4b62-912f-d6d90cd827d4-kube-api-access-8g5wr\") pod \"console-5fdc7f988b-2867z\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.372542 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.372512 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:18:51.715229 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.715140 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-566b6b7754-2cpbf"] Apr 17 10:18:51.935171 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:51.935140 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fdc7f988b-2867z"] Apr 17 10:18:51.938982 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:18:51.938954 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45889600_158b_4b62_912f_d6d90cd827d4.slice/crio-dee2d30572f3f3e76b23d373003e88bae54e2ba81fcc9db2f15ba50a6bd89699 WatchSource:0}: Error finding container dee2d30572f3f3e76b23d373003e88bae54e2ba81fcc9db2f15ba50a6bd89699: Status 404 returned error can't find the container with id dee2d30572f3f3e76b23d373003e88bae54e2ba81fcc9db2f15ba50a6bd89699 Apr 17 10:18:52.282626 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:52.282558 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fdc7f988b-2867z" event={"ID":"45889600-158b-4b62-912f-d6d90cd827d4","Type":"ContainerStarted","Data":"dee2d30572f3f3e76b23d373003e88bae54e2ba81fcc9db2f15ba50a6bd89699"} Apr 17 10:18:52.284012 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:52.283984 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf" event={"ID":"8fc6b947-b1ac-4dd3-ad0d-963849a98169","Type":"ContainerStarted","Data":"c4eff38f72d949b7a02c239a37f26eb31d693dd92c01b607ffd1f32e60eef959"} Apr 17 10:18:53.289228 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:53.289180 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf" 
event={"ID":"8fc6b947-b1ac-4dd3-ad0d-963849a98169","Type":"ContainerStarted","Data":"c0832720fcd65f1d5fbb5c91c96095d90cda1d9bb51ae88a07616c4f9f4165f0"} Apr 17 10:18:53.289228 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:53.289231 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf" event={"ID":"8fc6b947-b1ac-4dd3-ad0d-963849a98169","Type":"ContainerStarted","Data":"74825c9c9519a70c4d8823b3406cae472bea622c17dc0f3de0692bd764bc9b3c"} Apr 17 10:18:53.310607 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:53.310547 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-bdf54899-tlqjf" podStartSLOduration=2.090795976 podStartE2EDuration="5.310530566s" podCreationTimestamp="2026-04-17 10:18:48 +0000 UTC" firstStartedPulling="2026-04-17 10:18:49.647400388 +0000 UTC m=+89.383586724" lastFinishedPulling="2026-04-17 10:18:52.867134978 +0000 UTC m=+92.603321314" observedRunningTime="2026-04-17 10:18:53.309638756 +0000 UTC m=+93.045825114" watchObservedRunningTime="2026-04-17 10:18:53.310530566 +0000 UTC m=+93.046716932" Apr 17 10:18:55.298058 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:55.298018 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fdc7f988b-2867z" event={"ID":"45889600-158b-4b62-912f-d6d90cd827d4","Type":"ContainerStarted","Data":"33c0819f494950ae372392b3fc596489c35eed327e9274ff84b001e7406dc1e9"} Apr 17 10:18:55.324830 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:55.324773 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5fdc7f988b-2867z" podStartSLOduration=1.517557857 podStartE2EDuration="4.324757463s" podCreationTimestamp="2026-04-17 10:18:51 +0000 UTC" firstStartedPulling="2026-04-17 10:18:51.942705668 +0000 UTC m=+91.678892000" lastFinishedPulling="2026-04-17 10:18:54.74990527 +0000 UTC m=+94.486091606" observedRunningTime="2026-04-17 
10:18:55.323845456 +0000 UTC m=+95.060031808" watchObservedRunningTime="2026-04-17 10:18:55.324757463 +0000 UTC m=+95.060943815" Apr 17 10:18:55.995107 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:55.995071 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fdc7f988b-2867z"] Apr 17 10:18:56.284757 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:18:56.284726 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6d6dc7bc7f-5np9w" Apr 17 10:19:01.373156 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:01.373102 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:19:08.217307 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:08.217266 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t" Apr 17 10:19:08.217307 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:08.217313 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t" Apr 17 10:19:16.736645 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:16.736581 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" podUID="fff2f824-dc09-4d90-817f-04bff462678a" containerName="registry" containerID="cri-o://a8c25b3bb3eafc8e9457f8ca61cf7462162bff5f25046593279a71840b4fc4ca" gracePeriod=30 Apr 17 10:19:16.993033 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:16.992964 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:19:17.061897 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.061848 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fff2f824-dc09-4d90-817f-04bff462678a-installation-pull-secrets\") pod \"fff2f824-dc09-4d90-817f-04bff462678a\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " Apr 17 10:19:17.062088 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.061945 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fff2f824-dc09-4d90-817f-04bff462678a-image-registry-private-configuration\") pod \"fff2f824-dc09-4d90-817f-04bff462678a\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " Apr 17 10:19:17.062088 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.062001 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gll59\" (UniqueName: \"kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-kube-api-access-gll59\") pod \"fff2f824-dc09-4d90-817f-04bff462678a\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " Apr 17 10:19:17.062088 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.062021 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fff2f824-dc09-4d90-817f-04bff462678a-ca-trust-extracted\") pod \"fff2f824-dc09-4d90-817f-04bff462678a\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " Apr 17 10:19:17.062088 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.062057 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fff2f824-dc09-4d90-817f-04bff462678a-registry-certificates\") pod \"fff2f824-dc09-4d90-817f-04bff462678a\" 
(UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " Apr 17 10:19:17.062088 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.062081 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-registry-tls\") pod \"fff2f824-dc09-4d90-817f-04bff462678a\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " Apr 17 10:19:17.062345 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.062107 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fff2f824-dc09-4d90-817f-04bff462678a-trusted-ca\") pod \"fff2f824-dc09-4d90-817f-04bff462678a\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " Apr 17 10:19:17.062345 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.062138 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-bound-sa-token\") pod \"fff2f824-dc09-4d90-817f-04bff462678a\" (UID: \"fff2f824-dc09-4d90-817f-04bff462678a\") " Apr 17 10:19:17.062684 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.062624 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fff2f824-dc09-4d90-817f-04bff462678a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "fff2f824-dc09-4d90-817f-04bff462678a" (UID: "fff2f824-dc09-4d90-817f-04bff462678a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 10:19:17.062974 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.062942 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fff2f824-dc09-4d90-817f-04bff462678a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "fff2f824-dc09-4d90-817f-04bff462678a" (UID: "fff2f824-dc09-4d90-817f-04bff462678a"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 10:19:17.067029 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.066994 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fff2f824-dc09-4d90-817f-04bff462678a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "fff2f824-dc09-4d90-817f-04bff462678a" (UID: "fff2f824-dc09-4d90-817f-04bff462678a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 10:19:17.067029 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.067022 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-kube-api-access-gll59" (OuterVolumeSpecName: "kube-api-access-gll59") pod "fff2f824-dc09-4d90-817f-04bff462678a" (UID: "fff2f824-dc09-4d90-817f-04bff462678a"). InnerVolumeSpecName "kube-api-access-gll59". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 10:19:17.067174 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.066999 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "fff2f824-dc09-4d90-817f-04bff462678a" (UID: "fff2f824-dc09-4d90-817f-04bff462678a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 10:19:17.067174 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.067110 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "fff2f824-dc09-4d90-817f-04bff462678a" (UID: "fff2f824-dc09-4d90-817f-04bff462678a"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 10:19:17.067174 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.067111 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fff2f824-dc09-4d90-817f-04bff462678a-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "fff2f824-dc09-4d90-817f-04bff462678a" (UID: "fff2f824-dc09-4d90-817f-04bff462678a"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 10:19:17.071182 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.071149 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fff2f824-dc09-4d90-817f-04bff462678a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "fff2f824-dc09-4d90-817f-04bff462678a" (UID: "fff2f824-dc09-4d90-817f-04bff462678a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 10:19:17.162830 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.162786 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fff2f824-dc09-4d90-817f-04bff462678a-trusted-ca\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:19:17.162830 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.162825 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-bound-sa-token\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:19:17.162830 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.162835 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fff2f824-dc09-4d90-817f-04bff462678a-installation-pull-secrets\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" 
Apr 17 10:19:17.163067 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.162847 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fff2f824-dc09-4d90-817f-04bff462678a-image-registry-private-configuration\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:19:17.163067 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.162857 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gll59\" (UniqueName: \"kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-kube-api-access-gll59\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:19:17.163067 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.162867 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fff2f824-dc09-4d90-817f-04bff462678a-ca-trust-extracted\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:19:17.163067 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.162876 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fff2f824-dc09-4d90-817f-04bff462678a-registry-certificates\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:19:17.163067 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.162884 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fff2f824-dc09-4d90-817f-04bff462678a-registry-tls\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:19:17.361459 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.361400 2579 generic.go:358] "Generic (PLEG): container finished" podID="fff2f824-dc09-4d90-817f-04bff462678a" containerID="a8c25b3bb3eafc8e9457f8ca61cf7462162bff5f25046593279a71840b4fc4ca" exitCode=0 Apr 17 10:19:17.361657 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.361486 2579 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" Apr 17 10:19:17.361657 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.361493 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" event={"ID":"fff2f824-dc09-4d90-817f-04bff462678a","Type":"ContainerDied","Data":"a8c25b3bb3eafc8e9457f8ca61cf7462162bff5f25046593279a71840b4fc4ca"} Apr 17 10:19:17.361657 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.361539 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-566b6b7754-2cpbf" event={"ID":"fff2f824-dc09-4d90-817f-04bff462678a","Type":"ContainerDied","Data":"17f0a29dae2b51d3727d7f59c8f4f972e93a7b5d63b127cdaf341367a90fe9d3"} Apr 17 10:19:17.361657 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.361561 2579 scope.go:117] "RemoveContainer" containerID="a8c25b3bb3eafc8e9457f8ca61cf7462162bff5f25046593279a71840b4fc4ca" Apr 17 10:19:17.370168 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.370144 2579 scope.go:117] "RemoveContainer" containerID="a8c25b3bb3eafc8e9457f8ca61cf7462162bff5f25046593279a71840b4fc4ca" Apr 17 10:19:17.370452 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:19:17.370404 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8c25b3bb3eafc8e9457f8ca61cf7462162bff5f25046593279a71840b4fc4ca\": container with ID starting with a8c25b3bb3eafc8e9457f8ca61cf7462162bff5f25046593279a71840b4fc4ca not found: ID does not exist" containerID="a8c25b3bb3eafc8e9457f8ca61cf7462162bff5f25046593279a71840b4fc4ca" Apr 17 10:19:17.370538 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.370464 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8c25b3bb3eafc8e9457f8ca61cf7462162bff5f25046593279a71840b4fc4ca"} err="failed to get container 
status \"a8c25b3bb3eafc8e9457f8ca61cf7462162bff5f25046593279a71840b4fc4ca\": rpc error: code = NotFound desc = could not find container \"a8c25b3bb3eafc8e9457f8ca61cf7462162bff5f25046593279a71840b4fc4ca\": container with ID starting with a8c25b3bb3eafc8e9457f8ca61cf7462162bff5f25046593279a71840b4fc4ca not found: ID does not exist" Apr 17 10:19:17.382593 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.382559 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-566b6b7754-2cpbf"] Apr 17 10:19:17.386787 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:17.386764 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-566b6b7754-2cpbf"] Apr 17 10:19:18.868148 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:18.868117 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fff2f824-dc09-4d90-817f-04bff462678a" path="/var/lib/kubelet/pods/fff2f824-dc09-4d90-817f-04bff462678a/volumes" Apr 17 10:19:22.322560 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.322516 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5fdc7f988b-2867z" podUID="45889600-158b-4b62-912f-d6d90cd827d4" containerName="console" containerID="cri-o://33c0819f494950ae372392b3fc596489c35eed327e9274ff84b001e7406dc1e9" gracePeriod=15 Apr 17 10:19:22.561521 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.561497 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fdc7f988b-2867z_45889600-158b-4b62-912f-d6d90cd827d4/console/0.log" Apr 17 10:19:22.561654 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.561559 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:19:22.610641 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.610558 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-trusted-ca-bundle\") pod \"45889600-158b-4b62-912f-d6d90cd827d4\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " Apr 17 10:19:22.610641 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.610612 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g5wr\" (UniqueName: \"kubernetes.io/projected/45889600-158b-4b62-912f-d6d90cd827d4-kube-api-access-8g5wr\") pod \"45889600-158b-4b62-912f-d6d90cd827d4\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " Apr 17 10:19:22.610855 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.610648 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-oauth-serving-cert\") pod \"45889600-158b-4b62-912f-d6d90cd827d4\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " Apr 17 10:19:22.610855 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.610750 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/45889600-158b-4b62-912f-d6d90cd827d4-console-oauth-config\") pod \"45889600-158b-4b62-912f-d6d90cd827d4\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " Apr 17 10:19:22.610855 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.610799 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/45889600-158b-4b62-912f-d6d90cd827d4-console-serving-cert\") pod \"45889600-158b-4b62-912f-d6d90cd827d4\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " Apr 17 
10:19:22.610855 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.610839 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-service-ca\") pod \"45889600-158b-4b62-912f-d6d90cd827d4\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " Apr 17 10:19:22.611061 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.610871 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-console-config\") pod \"45889600-158b-4b62-912f-d6d90cd827d4\" (UID: \"45889600-158b-4b62-912f-d6d90cd827d4\") " Apr 17 10:19:22.611061 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.611019 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "45889600-158b-4b62-912f-d6d90cd827d4" (UID: "45889600-158b-4b62-912f-d6d90cd827d4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 10:19:22.611061 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.611026 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "45889600-158b-4b62-912f-d6d90cd827d4" (UID: "45889600-158b-4b62-912f-d6d90cd827d4"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 10:19:22.611212 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.611193 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-trusted-ca-bundle\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:19:22.611276 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.611220 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-oauth-serving-cert\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:19:22.611326 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.611289 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-service-ca" (OuterVolumeSpecName: "service-ca") pod "45889600-158b-4b62-912f-d6d90cd827d4" (UID: "45889600-158b-4b62-912f-d6d90cd827d4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 10:19:22.611379 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.611337 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-console-config" (OuterVolumeSpecName: "console-config") pod "45889600-158b-4b62-912f-d6d90cd827d4" (UID: "45889600-158b-4b62-912f-d6d90cd827d4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 10:19:22.612948 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.612920 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45889600-158b-4b62-912f-d6d90cd827d4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "45889600-158b-4b62-912f-d6d90cd827d4" (UID: "45889600-158b-4b62-912f-d6d90cd827d4"). 
InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 10:19:22.613050 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.612945 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45889600-158b-4b62-912f-d6d90cd827d4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "45889600-158b-4b62-912f-d6d90cd827d4" (UID: "45889600-158b-4b62-912f-d6d90cd827d4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 10:19:22.613050 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.613005 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45889600-158b-4b62-912f-d6d90cd827d4-kube-api-access-8g5wr" (OuterVolumeSpecName: "kube-api-access-8g5wr") pod "45889600-158b-4b62-912f-d6d90cd827d4" (UID: "45889600-158b-4b62-912f-d6d90cd827d4"). InnerVolumeSpecName "kube-api-access-8g5wr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 10:19:22.712274 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.712222 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8g5wr\" (UniqueName: \"kubernetes.io/projected/45889600-158b-4b62-912f-d6d90cd827d4-kube-api-access-8g5wr\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:19:22.712274 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.712266 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/45889600-158b-4b62-912f-d6d90cd827d4-console-oauth-config\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:19:22.712274 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.712277 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/45889600-158b-4b62-912f-d6d90cd827d4-console-serving-cert\") on node 
\"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:19:22.712274 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.712287 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-service-ca\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:19:22.712274 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:22.712297 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/45889600-158b-4b62-912f-d6d90cd827d4-console-config\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:19:23.379836 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:23.379809 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fdc7f988b-2867z_45889600-158b-4b62-912f-d6d90cd827d4/console/0.log" Apr 17 10:19:23.380230 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:23.379852 2579 generic.go:358] "Generic (PLEG): container finished" podID="45889600-158b-4b62-912f-d6d90cd827d4" containerID="33c0819f494950ae372392b3fc596489c35eed327e9274ff84b001e7406dc1e9" exitCode=2 Apr 17 10:19:23.380230 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:23.379922 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fdc7f988b-2867z" event={"ID":"45889600-158b-4b62-912f-d6d90cd827d4","Type":"ContainerDied","Data":"33c0819f494950ae372392b3fc596489c35eed327e9274ff84b001e7406dc1e9"} Apr 17 10:19:23.380230 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:23.379953 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fdc7f988b-2867z" event={"ID":"45889600-158b-4b62-912f-d6d90cd827d4","Type":"ContainerDied","Data":"dee2d30572f3f3e76b23d373003e88bae54e2ba81fcc9db2f15ba50a6bd89699"} Apr 17 10:19:23.380230 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:23.379930 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fdc7f988b-2867z" Apr 17 10:19:23.380230 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:23.379969 2579 scope.go:117] "RemoveContainer" containerID="33c0819f494950ae372392b3fc596489c35eed327e9274ff84b001e7406dc1e9" Apr 17 10:19:23.381403 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:23.381381 2579 generic.go:358] "Generic (PLEG): container finished" podID="0837740f-2592-494a-bec7-ad834d8b557e" containerID="d6357b18aadea88eb8faa5a70f51d7d98c1c1a360ca8bbe9a11cfc2392d7a8f8" exitCode=0 Apr 17 10:19:23.381499 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:23.381460 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fhlm5" event={"ID":"0837740f-2592-494a-bec7-ad834d8b557e","Type":"ContainerDied","Data":"d6357b18aadea88eb8faa5a70f51d7d98c1c1a360ca8bbe9a11cfc2392d7a8f8"} Apr 17 10:19:23.381756 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:23.381743 2579 scope.go:117] "RemoveContainer" containerID="d6357b18aadea88eb8faa5a70f51d7d98c1c1a360ca8bbe9a11cfc2392d7a8f8" Apr 17 10:19:23.396044 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:23.395939 2579 scope.go:117] "RemoveContainer" containerID="33c0819f494950ae372392b3fc596489c35eed327e9274ff84b001e7406dc1e9" Apr 17 10:19:23.397488 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:19:23.397448 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33c0819f494950ae372392b3fc596489c35eed327e9274ff84b001e7406dc1e9\": container with ID starting with 33c0819f494950ae372392b3fc596489c35eed327e9274ff84b001e7406dc1e9 not found: ID does not exist" containerID="33c0819f494950ae372392b3fc596489c35eed327e9274ff84b001e7406dc1e9" Apr 17 10:19:23.397589 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:23.397495 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"33c0819f494950ae372392b3fc596489c35eed327e9274ff84b001e7406dc1e9"} err="failed to get container status \"33c0819f494950ae372392b3fc596489c35eed327e9274ff84b001e7406dc1e9\": rpc error: code = NotFound desc = could not find container \"33c0819f494950ae372392b3fc596489c35eed327e9274ff84b001e7406dc1e9\": container with ID starting with 33c0819f494950ae372392b3fc596489c35eed327e9274ff84b001e7406dc1e9 not found: ID does not exist" Apr 17 10:19:23.399601 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:23.399557 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fdc7f988b-2867z"] Apr 17 10:19:23.401143 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:23.401121 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5fdc7f988b-2867z"] Apr 17 10:19:24.385854 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:24.385815 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-fhlm5" event={"ID":"0837740f-2592-494a-bec7-ad834d8b557e","Type":"ContainerStarted","Data":"0709855ff117b3a1a7db460398a1d54c2b31a00231383776e748b78234f77ce0"} Apr 17 10:19:24.868508 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:24.868472 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45889600-158b-4b62-912f-d6d90cd827d4" path="/var/lib/kubelet/pods/45889600-158b-4b62-912f-d6d90cd827d4/volumes" Apr 17 10:19:28.222867 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:28.222838 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t" Apr 17 10:19:28.226749 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:28.226727 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6f79cf884b-wdz2t" Apr 17 10:19:56.093155 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.093122 2579 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 10:19:56.093628 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.093435 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45889600-158b-4b62-912f-d6d90cd827d4" containerName="console" Apr 17 10:19:56.093628 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.093448 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="45889600-158b-4b62-912f-d6d90cd827d4" containerName="console" Apr 17 10:19:56.093628 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.093458 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fff2f824-dc09-4d90-817f-04bff462678a" containerName="registry" Apr 17 10:19:56.093628 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.093464 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff2f824-dc09-4d90-817f-04bff462678a" containerName="registry" Apr 17 10:19:56.093628 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.093528 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="45889600-158b-4b62-912f-d6d90cd827d4" containerName="console" Apr 17 10:19:56.093628 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.093536 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="fff2f824-dc09-4d90-817f-04bff462678a" containerName="registry" Apr 17 10:19:56.098320 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.098302 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.100667 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.100635 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 10:19:56.100797 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.100669 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 10:19:56.100916 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.100900 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 10:19:56.100916 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.100911 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-lgppc\"" Apr 17 10:19:56.101034 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.100925 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 10:19:56.101034 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.101016 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 10:19:56.101034 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.101029 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 10:19:56.101173 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.101044 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 10:19:56.101173 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.101141 2579 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 10:19:56.105545 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.105526 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 10:19:56.108294 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.108270 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 10:19:56.187694 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.187654 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/94d89ba1-46d8-4c66-bf40-66f7dc002163-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.187694 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.187694 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/94d89ba1-46d8-4c66-bf40-66f7dc002163-config-volume\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.187937 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.187724 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/94d89ba1-46d8-4c66-bf40-66f7dc002163-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.187937 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.187748 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" 
(UniqueName: \"kubernetes.io/empty-dir/94d89ba1-46d8-4c66-bf40-66f7dc002163-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.187937 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.187776 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/94d89ba1-46d8-4c66-bf40-66f7dc002163-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.187937 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.187799 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgz8q\" (UniqueName: \"kubernetes.io/projected/94d89ba1-46d8-4c66-bf40-66f7dc002163-kube-api-access-lgz8q\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.187937 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.187842 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94d89ba1-46d8-4c66-bf40-66f7dc002163-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.187937 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.187913 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/94d89ba1-46d8-4c66-bf40-66f7dc002163-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.188145 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.187940 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/94d89ba1-46d8-4c66-bf40-66f7dc002163-web-config\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.188145 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.187972 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/94d89ba1-46d8-4c66-bf40-66f7dc002163-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.188145 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.188018 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/94d89ba1-46d8-4c66-bf40-66f7dc002163-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.188145 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.188041 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/94d89ba1-46d8-4c66-bf40-66f7dc002163-tls-assets\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.188145 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.188061 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/94d89ba1-46d8-4c66-bf40-66f7dc002163-config-out\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.288806 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.288769 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/94d89ba1-46d8-4c66-bf40-66f7dc002163-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.288806 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.288807 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/94d89ba1-46d8-4c66-bf40-66f7dc002163-web-config\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.289015 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.288859 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/94d89ba1-46d8-4c66-bf40-66f7dc002163-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.289015 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.288909 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/94d89ba1-46d8-4c66-bf40-66f7dc002163-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.289015 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.288937 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/94d89ba1-46d8-4c66-bf40-66f7dc002163-tls-assets\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.289015 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.288959 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/94d89ba1-46d8-4c66-bf40-66f7dc002163-config-out\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.289015 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.288992 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/94d89ba1-46d8-4c66-bf40-66f7dc002163-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.289261 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.289166 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/94d89ba1-46d8-4c66-bf40-66f7dc002163-config-volume\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.289261 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.289202 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/94d89ba1-46d8-4c66-bf40-66f7dc002163-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.289261 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.289234 
2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/94d89ba1-46d8-4c66-bf40-66f7dc002163-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.289407 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.289264 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/94d89ba1-46d8-4c66-bf40-66f7dc002163-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.289407 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.289290 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgz8q\" (UniqueName: \"kubernetes.io/projected/94d89ba1-46d8-4c66-bf40-66f7dc002163-kube-api-access-lgz8q\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.289407 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.289320 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94d89ba1-46d8-4c66-bf40-66f7dc002163-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.290124 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.289792 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/94d89ba1-46d8-4c66-bf40-66f7dc002163-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.290530 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.290507 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94d89ba1-46d8-4c66-bf40-66f7dc002163-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.292358 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.292047 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/94d89ba1-46d8-4c66-bf40-66f7dc002163-tls-assets\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.292358 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.292057 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/94d89ba1-46d8-4c66-bf40-66f7dc002163-config-out\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.292358 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.292081 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/94d89ba1-46d8-4c66-bf40-66f7dc002163-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.292358 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.292058 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/94d89ba1-46d8-4c66-bf40-66f7dc002163-web-config\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.292358 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.292316 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/94d89ba1-46d8-4c66-bf40-66f7dc002163-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.292618 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.292372 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/94d89ba1-46d8-4c66-bf40-66f7dc002163-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.292618 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.292539 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/94d89ba1-46d8-4c66-bf40-66f7dc002163-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.292747 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.292725 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/94d89ba1-46d8-4c66-bf40-66f7dc002163-config-volume\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.293052 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.293035 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/94d89ba1-46d8-4c66-bf40-66f7dc002163-cluster-tls-config\") pod 
\"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.293988 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.293966 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/94d89ba1-46d8-4c66-bf40-66f7dc002163-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.299679 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.299660 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgz8q\" (UniqueName: \"kubernetes.io/projected/94d89ba1-46d8-4c66-bf40-66f7dc002163-kube-api-access-lgz8q\") pod \"alertmanager-main-0\" (UID: \"94d89ba1-46d8-4c66-bf40-66f7dc002163\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.409743 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.409658 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 10:19:56.542173 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:56.542139 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 10:19:56.545711 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:19:56.545684 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94d89ba1_46d8_4c66_bf40_66f7dc002163.slice/crio-b6a6b1477f17b310a1f0871188399a9734ed60a4df9542f396a6e7893c041da6 WatchSource:0}: Error finding container b6a6b1477f17b310a1f0871188399a9734ed60a4df9542f396a6e7893c041da6: Status 404 returned error can't find the container with id b6a6b1477f17b310a1f0871188399a9734ed60a4df9542f396a6e7893c041da6 Apr 17 10:19:57.486392 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:57.486352 2579 generic.go:358] "Generic (PLEG): container finished" podID="94d89ba1-46d8-4c66-bf40-66f7dc002163" containerID="69485bb0216a6c24b076c5906ff1d5af489dd92b0f43a79c5e82de947ef6405e" exitCode=0 Apr 17 10:19:57.486802 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:57.486408 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"94d89ba1-46d8-4c66-bf40-66f7dc002163","Type":"ContainerDied","Data":"69485bb0216a6c24b076c5906ff1d5af489dd92b0f43a79c5e82de947ef6405e"} Apr 17 10:19:57.486802 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:57.486451 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"94d89ba1-46d8-4c66-bf40-66f7dc002163","Type":"ContainerStarted","Data":"b6a6b1477f17b310a1f0871188399a9734ed60a4df9542f396a6e7893c041da6"} Apr 17 10:19:59.497735 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:59.497703 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"94d89ba1-46d8-4c66-bf40-66f7dc002163","Type":"ContainerStarted","Data":"2c29fbfd35ffbbb9baed881f71ff376a04070807c11b0eba65ad9380463b1f64"} Apr 17 10:19:59.497735 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:59.497738 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"94d89ba1-46d8-4c66-bf40-66f7dc002163","Type":"ContainerStarted","Data":"0e6e49005962b151ff92afcdfa95f083542a936d85aafe96305c29241be51429"} Apr 17 10:19:59.498146 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:59.497751 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"94d89ba1-46d8-4c66-bf40-66f7dc002163","Type":"ContainerStarted","Data":"f132647c32f8ce4df7ba329db92640f8511d4b99549246e89887d6ccd40ffb7d"} Apr 17 10:19:59.498146 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:59.497759 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"94d89ba1-46d8-4c66-bf40-66f7dc002163","Type":"ContainerStarted","Data":"21043b6b34e47e990e9f7c0340cba632a2c07eacf999e1ad32171fc53f0a5169"} Apr 17 10:19:59.498146 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:59.497768 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"94d89ba1-46d8-4c66-bf40-66f7dc002163","Type":"ContainerStarted","Data":"435b7e53d5d7dc989a0df7775bfdd55109ce6cc3125c9f16e7d8c64b12e79b69"} Apr 17 10:19:59.498146 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:59.497776 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"94d89ba1-46d8-4c66-bf40-66f7dc002163","Type":"ContainerStarted","Data":"b9c0e88d3d4b19cbd3e14a2ba16d5dd99e6850a9a4ea53d4b83057e8576602f8"} Apr 17 10:19:59.527152 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:19:59.527094 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.246170129 podStartE2EDuration="3.527075749s" podCreationTimestamp="2026-04-17 10:19:56 +0000 UTC" firstStartedPulling="2026-04-17 10:19:57.488119787 +0000 UTC m=+157.224306120" lastFinishedPulling="2026-04-17 10:19:58.769025396 +0000 UTC m=+158.505211740" observedRunningTime="2026-04-17 10:19:59.524862431 +0000 UTC m=+159.261048783" watchObservedRunningTime="2026-04-17 10:19:59.527075749 +0000 UTC m=+159.263262139" Apr 17 10:20:14.030269 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.030228 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b7c9cbb88-bbfdl"] Apr 17 10:20:14.035895 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.035859 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.038735 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.038694 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 10:20:14.038893 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.038695 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 10:20:14.038893 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.038700 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 10:20:14.039013 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.038928 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-4b9bn\"" Apr 17 10:20:14.039125 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.039103 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 10:20:14.039730 ip-10-0-128-180 kubenswrapper[2579]: I0417 
10:20:14.039712 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 10:20:14.039934 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.039918 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 10:20:14.040008 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.039957 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 10:20:14.044675 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.044453 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 10:20:14.046514 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.046472 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b7c9cbb88-bbfdl"] Apr 17 10:20:14.137751 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.137715 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-console-oauth-config\") pod \"console-7b7c9cbb88-bbfdl\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.137751 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.137752 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-trusted-ca-bundle\") pod \"console-7b7c9cbb88-bbfdl\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.137983 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.137820 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-oauth-serving-cert\") pod \"console-7b7c9cbb88-bbfdl\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.137983 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.137849 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-console-serving-cert\") pod \"console-7b7c9cbb88-bbfdl\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.137983 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.137883 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-console-config\") pod \"console-7b7c9cbb88-bbfdl\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.137983 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.137900 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvc2l\" (UniqueName: \"kubernetes.io/projected/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-kube-api-access-bvc2l\") pod \"console-7b7c9cbb88-bbfdl\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.137983 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.137930 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-service-ca\") pod \"console-7b7c9cbb88-bbfdl\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.239204 
ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.239169 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-console-oauth-config\") pod \"console-7b7c9cbb88-bbfdl\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.239204 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.239205 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-trusted-ca-bundle\") pod \"console-7b7c9cbb88-bbfdl\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.239429 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.239327 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-oauth-serving-cert\") pod \"console-7b7c9cbb88-bbfdl\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.239429 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.239355 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-console-serving-cert\") pod \"console-7b7c9cbb88-bbfdl\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.239429 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.239376 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-console-config\") pod \"console-7b7c9cbb88-bbfdl\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " 
pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.239429 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.239393 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvc2l\" (UniqueName: \"kubernetes.io/projected/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-kube-api-access-bvc2l\") pod \"console-7b7c9cbb88-bbfdl\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.239640 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.239516 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-service-ca\") pod \"console-7b7c9cbb88-bbfdl\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.240197 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.240167 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-oauth-serving-cert\") pod \"console-7b7c9cbb88-bbfdl\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.240341 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.240315 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-service-ca\") pod \"console-7b7c9cbb88-bbfdl\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.240451 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.240406 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-trusted-ca-bundle\") pod \"console-7b7c9cbb88-bbfdl\" (UID: 
\"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.240634 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.240617 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-console-config\") pod \"console-7b7c9cbb88-bbfdl\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.241840 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.241807 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-console-oauth-config\") pod \"console-7b7c9cbb88-bbfdl\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.241954 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.241937 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-console-serving-cert\") pod \"console-7b7c9cbb88-bbfdl\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.247078 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.247055 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvc2l\" (UniqueName: \"kubernetes.io/projected/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-kube-api-access-bvc2l\") pod \"console-7b7c9cbb88-bbfdl\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.349411 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.349314 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:20:14.472760 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.472727 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b7c9cbb88-bbfdl"] Apr 17 10:20:14.475821 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:20:14.475791 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e82d377_1ba5_4e0d_8823_e3b577dfd1c4.slice/crio-417e1a39e5a3094b88486633f53d4e10807899eff66a53f68e454f05498fc38c WatchSource:0}: Error finding container 417e1a39e5a3094b88486633f53d4e10807899eff66a53f68e454f05498fc38c: Status 404 returned error can't find the container with id 417e1a39e5a3094b88486633f53d4e10807899eff66a53f68e454f05498fc38c Apr 17 10:20:14.542735 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.542696 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b7c9cbb88-bbfdl" event={"ID":"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4","Type":"ContainerStarted","Data":"626c9b7086314e637564c0f28916b1e952e15474cff8338e4156d42376fa2b0f"} Apr 17 10:20:14.542735 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.542744 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b7c9cbb88-bbfdl" event={"ID":"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4","Type":"ContainerStarted","Data":"417e1a39e5a3094b88486633f53d4e10807899eff66a53f68e454f05498fc38c"} Apr 17 10:20:14.559759 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:14.559708 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b7c9cbb88-bbfdl" podStartSLOduration=0.559690328 podStartE2EDuration="559.690328ms" podCreationTimestamp="2026-04-17 10:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 10:20:14.558542877 +0000 UTC 
m=+174.294729229" watchObservedRunningTime="2026-04-17 10:20:14.559690328 +0000 UTC m=+174.295876681"
Apr 17 10:20:24.350065 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:24.350020 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b7c9cbb88-bbfdl"
Apr 17 10:20:24.350065 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:24.350070 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b7c9cbb88-bbfdl"
Apr 17 10:20:24.354827 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:24.354801 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b7c9cbb88-bbfdl"
Apr 17 10:20:24.577409 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:24.577377 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b7c9cbb88-bbfdl"
Apr 17 10:20:58.795525 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:58.795493 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd"]
Apr 17 10:20:58.798965 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:58.798949 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd"
Apr 17 10:20:58.801431 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:58.801394 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-gprb4\""
Apr 17 10:20:58.801491 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:58.801443 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 10:20:58.801491 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:58.801456 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 10:20:58.806664 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:58.806642 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd"]
Apr 17 10:20:58.822683 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:58.822652 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3cc17bd8-0f18-4de5-ae2d-8fd44f51847d-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd\" (UID: \"3cc17bd8-0f18-4de5-ae2d-8fd44f51847d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd"
Apr 17 10:20:58.822819 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:58.822693 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3cc17bd8-0f18-4de5-ae2d-8fd44f51847d-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd\" (UID: \"3cc17bd8-0f18-4de5-ae2d-8fd44f51847d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd"
Apr 17 10:20:58.822819 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:58.822743 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpm5h\" (UniqueName: \"kubernetes.io/projected/3cc17bd8-0f18-4de5-ae2d-8fd44f51847d-kube-api-access-bpm5h\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd\" (UID: \"3cc17bd8-0f18-4de5-ae2d-8fd44f51847d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd"
Apr 17 10:20:58.923775 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:58.923736 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3cc17bd8-0f18-4de5-ae2d-8fd44f51847d-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd\" (UID: \"3cc17bd8-0f18-4de5-ae2d-8fd44f51847d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd"
Apr 17 10:20:58.923775 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:58.923775 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3cc17bd8-0f18-4de5-ae2d-8fd44f51847d-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd\" (UID: \"3cc17bd8-0f18-4de5-ae2d-8fd44f51847d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd"
Apr 17 10:20:58.924039 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:58.923803 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpm5h\" (UniqueName: \"kubernetes.io/projected/3cc17bd8-0f18-4de5-ae2d-8fd44f51847d-kube-api-access-bpm5h\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd\" (UID: \"3cc17bd8-0f18-4de5-ae2d-8fd44f51847d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd"
Apr 17 10:20:58.924155 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:58.924135 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3cc17bd8-0f18-4de5-ae2d-8fd44f51847d-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd\" (UID: \"3cc17bd8-0f18-4de5-ae2d-8fd44f51847d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd"
Apr 17 10:20:58.924228 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:58.924208 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3cc17bd8-0f18-4de5-ae2d-8fd44f51847d-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd\" (UID: \"3cc17bd8-0f18-4de5-ae2d-8fd44f51847d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd"
Apr 17 10:20:58.934680 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:58.934647 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpm5h\" (UniqueName: \"kubernetes.io/projected/3cc17bd8-0f18-4de5-ae2d-8fd44f51847d-kube-api-access-bpm5h\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd\" (UID: \"3cc17bd8-0f18-4de5-ae2d-8fd44f51847d\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd"
Apr 17 10:20:59.109278 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:59.109195 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd"
Apr 17 10:20:59.229276 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:59.229247 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd"]
Apr 17 10:20:59.231840 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:20:59.231812 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cc17bd8_0f18_4de5_ae2d_8fd44f51847d.slice/crio-1bcc945ac1dcafddeecfe8d56178976fea7d152ff4e123a1149b0d389d54b1e2 WatchSource:0}: Error finding container 1bcc945ac1dcafddeecfe8d56178976fea7d152ff4e123a1149b0d389d54b1e2: Status 404 returned error can't find the container with id 1bcc945ac1dcafddeecfe8d56178976fea7d152ff4e123a1149b0d389d54b1e2
Apr 17 10:20:59.683718 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:20:59.683674 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd" event={"ID":"3cc17bd8-0f18-4de5-ae2d-8fd44f51847d","Type":"ContainerStarted","Data":"1bcc945ac1dcafddeecfe8d56178976fea7d152ff4e123a1149b0d389d54b1e2"}
Apr 17 10:21:04.701830 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:04.701791 2579 generic.go:358] "Generic (PLEG): container finished" podID="3cc17bd8-0f18-4de5-ae2d-8fd44f51847d" containerID="7068c108a363bc4044e8a5732aa20a17fdeb803896053e8569ac1a3b2e6eecd5" exitCode=0
Apr 17 10:21:04.702263 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:04.701877 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd" event={"ID":"3cc17bd8-0f18-4de5-ae2d-8fd44f51847d","Type":"ContainerDied","Data":"7068c108a363bc4044e8a5732aa20a17fdeb803896053e8569ac1a3b2e6eecd5"}
Apr 17 10:21:07.714273 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:07.714241 2579 generic.go:358] "Generic (PLEG): container finished" podID="3cc17bd8-0f18-4de5-ae2d-8fd44f51847d" containerID="255d8e0ad4840733a24a27ae415bdbdde469721ac0b5ae439cd103bb33b8ad22" exitCode=0
Apr 17 10:21:07.714707 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:07.714329 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd" event={"ID":"3cc17bd8-0f18-4de5-ae2d-8fd44f51847d","Type":"ContainerDied","Data":"255d8e0ad4840733a24a27ae415bdbdde469721ac0b5ae439cd103bb33b8ad22"}
Apr 17 10:21:13.736008 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:13.735919 2579 generic.go:358] "Generic (PLEG): container finished" podID="3cc17bd8-0f18-4de5-ae2d-8fd44f51847d" containerID="fd47a29678a4f7c5f5c39e698677f01e3810a426e9bb006f07b8887682cb9d28" exitCode=0
Apr 17 10:21:13.736358 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:13.736025 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd" event={"ID":"3cc17bd8-0f18-4de5-ae2d-8fd44f51847d","Type":"ContainerDied","Data":"fd47a29678a4f7c5f5c39e698677f01e3810a426e9bb006f07b8887682cb9d28"}
Apr 17 10:21:14.859689 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:14.859651 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd"
Apr 17 10:21:14.974225 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:14.974185 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3cc17bd8-0f18-4de5-ae2d-8fd44f51847d-util\") pod \"3cc17bd8-0f18-4de5-ae2d-8fd44f51847d\" (UID: \"3cc17bd8-0f18-4de5-ae2d-8fd44f51847d\") "
Apr 17 10:21:14.974225 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:14.974233 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3cc17bd8-0f18-4de5-ae2d-8fd44f51847d-bundle\") pod \"3cc17bd8-0f18-4de5-ae2d-8fd44f51847d\" (UID: \"3cc17bd8-0f18-4de5-ae2d-8fd44f51847d\") "
Apr 17 10:21:14.974514 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:14.974259 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpm5h\" (UniqueName: \"kubernetes.io/projected/3cc17bd8-0f18-4de5-ae2d-8fd44f51847d-kube-api-access-bpm5h\") pod \"3cc17bd8-0f18-4de5-ae2d-8fd44f51847d\" (UID: \"3cc17bd8-0f18-4de5-ae2d-8fd44f51847d\") "
Apr 17 10:21:14.974950 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:14.974922 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cc17bd8-0f18-4de5-ae2d-8fd44f51847d-bundle" (OuterVolumeSpecName: "bundle") pod "3cc17bd8-0f18-4de5-ae2d-8fd44f51847d" (UID: "3cc17bd8-0f18-4de5-ae2d-8fd44f51847d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 10:21:14.976578 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:14.976551 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cc17bd8-0f18-4de5-ae2d-8fd44f51847d-kube-api-access-bpm5h" (OuterVolumeSpecName: "kube-api-access-bpm5h") pod "3cc17bd8-0f18-4de5-ae2d-8fd44f51847d" (UID: "3cc17bd8-0f18-4de5-ae2d-8fd44f51847d"). InnerVolumeSpecName "kube-api-access-bpm5h". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 10:21:14.979174 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:14.979154 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cc17bd8-0f18-4de5-ae2d-8fd44f51847d-util" (OuterVolumeSpecName: "util") pod "3cc17bd8-0f18-4de5-ae2d-8fd44f51847d" (UID: "3cc17bd8-0f18-4de5-ae2d-8fd44f51847d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 10:21:15.075496 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:15.075460 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3cc17bd8-0f18-4de5-ae2d-8fd44f51847d-util\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\""
Apr 17 10:21:15.075496 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:15.075489 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3cc17bd8-0f18-4de5-ae2d-8fd44f51847d-bundle\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\""
Apr 17 10:21:15.075496 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:15.075500 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bpm5h\" (UniqueName: \"kubernetes.io/projected/3cc17bd8-0f18-4de5-ae2d-8fd44f51847d-kube-api-access-bpm5h\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\""
Apr 17 10:21:15.742824 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:15.742791 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd" event={"ID":"3cc17bd8-0f18-4de5-ae2d-8fd44f51847d","Type":"ContainerDied","Data":"1bcc945ac1dcafddeecfe8d56178976fea7d152ff4e123a1149b0d389d54b1e2"}
Apr 17 10:21:15.742824 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:15.742826 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bcc945ac1dcafddeecfe8d56178976fea7d152ff4e123a1149b0d389d54b1e2"
Apr 17 10:21:15.743021 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:15.742839 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e525qbd"
Apr 17 10:21:21.164405 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.164373 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-9h6sw"]
Apr 17 10:21:21.164814 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.164709 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3cc17bd8-0f18-4de5-ae2d-8fd44f51847d" containerName="pull"
Apr 17 10:21:21.164814 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.164721 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cc17bd8-0f18-4de5-ae2d-8fd44f51847d" containerName="pull"
Apr 17 10:21:21.164814 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.164741 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3cc17bd8-0f18-4de5-ae2d-8fd44f51847d" containerName="util"
Apr 17 10:21:21.164814 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.164746 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cc17bd8-0f18-4de5-ae2d-8fd44f51847d" containerName="util"
Apr 17 10:21:21.164814 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.164757 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3cc17bd8-0f18-4de5-ae2d-8fd44f51847d" containerName="extract"
Apr 17 10:21:21.164814 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.164763 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cc17bd8-0f18-4de5-ae2d-8fd44f51847d" containerName="extract"
Apr 17 10:21:21.164814 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.164810 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="3cc17bd8-0f18-4de5-ae2d-8fd44f51847d" containerName="extract"
Apr 17 10:21:21.208393 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.208347 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-9h6sw"]
Apr 17 10:21:21.208583 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.208504 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-9h6sw"
Apr 17 10:21:21.211076 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.211052 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-qft7m\""
Apr 17 10:21:21.211202 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.211053 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 17 10:21:21.211202 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.211054 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 17 10:21:21.327240 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.327202 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab91c3a8-534b-4d42-8469-01debcafb711-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-9h6sw\" (UID: \"ab91c3a8-534b-4d42-8469-01debcafb711\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-9h6sw"
Apr 17 10:21:21.327410 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.327256 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b5pw\" (UniqueName: \"kubernetes.io/projected/ab91c3a8-534b-4d42-8469-01debcafb711-kube-api-access-4b5pw\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-9h6sw\" (UID: \"ab91c3a8-534b-4d42-8469-01debcafb711\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-9h6sw"
Apr 17 10:21:21.428115 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.428035 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab91c3a8-534b-4d42-8469-01debcafb711-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-9h6sw\" (UID: \"ab91c3a8-534b-4d42-8469-01debcafb711\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-9h6sw"
Apr 17 10:21:21.428115 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.428090 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4b5pw\" (UniqueName: \"kubernetes.io/projected/ab91c3a8-534b-4d42-8469-01debcafb711-kube-api-access-4b5pw\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-9h6sw\" (UID: \"ab91c3a8-534b-4d42-8469-01debcafb711\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-9h6sw"
Apr 17 10:21:21.428440 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.428389 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab91c3a8-534b-4d42-8469-01debcafb711-tmp\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-9h6sw\" (UID: \"ab91c3a8-534b-4d42-8469-01debcafb711\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-9h6sw"
Apr 17 10:21:21.435915 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.435885 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b5pw\" (UniqueName: \"kubernetes.io/projected/ab91c3a8-534b-4d42-8469-01debcafb711-kube-api-access-4b5pw\") pod \"cert-manager-operator-controller-manager-7c5b8bd68-9h6sw\" (UID: \"ab91c3a8-534b-4d42-8469-01debcafb711\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-9h6sw"
Apr 17 10:21:21.517477 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.517442 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-9h6sw"
Apr 17 10:21:21.645429 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.645390 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-9h6sw"]
Apr 17 10:21:21.647636 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:21:21.647604 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab91c3a8_534b_4d42_8469_01debcafb711.slice/crio-146c5bc1847f3f746cffd2f067d523ded42fabca8970c4f8ab3207a29fa8796d WatchSource:0}: Error finding container 146c5bc1847f3f746cffd2f067d523ded42fabca8970c4f8ab3207a29fa8796d: Status 404 returned error can't find the container with id 146c5bc1847f3f746cffd2f067d523ded42fabca8970c4f8ab3207a29fa8796d
Apr 17 10:21:21.764545 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:21.764455 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-9h6sw" event={"ID":"ab91c3a8-534b-4d42-8469-01debcafb711","Type":"ContainerStarted","Data":"146c5bc1847f3f746cffd2f067d523ded42fabca8970c4f8ab3207a29fa8796d"}
Apr 17 10:21:24.779973 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:24.779930 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-9h6sw" event={"ID":"ab91c3a8-534b-4d42-8469-01debcafb711","Type":"ContainerStarted","Data":"6385c8e120ecdcb51f4cfd24956af7e61cf4817d23df4e6f82e6663c13282e05"}
Apr 17 10:21:24.802300 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:24.802234 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7c5b8bd68-9h6sw" podStartSLOduration=1.528461596 podStartE2EDuration="3.802213249s" podCreationTimestamp="2026-04-17 10:21:21 +0000 UTC" firstStartedPulling="2026-04-17 10:21:21.650164266 +0000 UTC m=+241.386350597" lastFinishedPulling="2026-04-17 10:21:23.92391591 +0000 UTC m=+243.660102250" observedRunningTime="2026-04-17 10:21:24.800396257 +0000 UTC m=+244.536582633" watchObservedRunningTime="2026-04-17 10:21:24.802213249 +0000 UTC m=+244.538399603"
Apr 17 10:21:42.762956 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:42.762866 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f"]
Apr 17 10:21:42.766341 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:42.766320 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f"
Apr 17 10:21:42.768624 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:42.768592 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 10:21:42.768754 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:42.768644 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-gprb4\""
Apr 17 10:21:42.769448 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:42.769411 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 10:21:42.773436 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:42.773396 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f"]
Apr 17 10:21:42.920127 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:42.920083 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz4zx\" (UniqueName: \"kubernetes.io/projected/80916018-7c56-4e06-8526-15b6afd3d573-kube-api-access-nz4zx\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f\" (UID: \"80916018-7c56-4e06-8526-15b6afd3d573\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f"
Apr 17 10:21:42.920315 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:42.920155 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80916018-7c56-4e06-8526-15b6afd3d573-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f\" (UID: \"80916018-7c56-4e06-8526-15b6afd3d573\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f"
Apr 17 10:21:42.920315 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:42.920224 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80916018-7c56-4e06-8526-15b6afd3d573-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f\" (UID: \"80916018-7c56-4e06-8526-15b6afd3d573\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f"
Apr 17 10:21:43.021337 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:43.021303 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80916018-7c56-4e06-8526-15b6afd3d573-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f\" (UID: \"80916018-7c56-4e06-8526-15b6afd3d573\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f"
Apr 17 10:21:43.021535 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:43.021388 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nz4zx\" (UniqueName: \"kubernetes.io/projected/80916018-7c56-4e06-8526-15b6afd3d573-kube-api-access-nz4zx\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f\" (UID: \"80916018-7c56-4e06-8526-15b6afd3d573\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f"
Apr 17 10:21:43.021535 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:43.021465 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80916018-7c56-4e06-8526-15b6afd3d573-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f\" (UID: \"80916018-7c56-4e06-8526-15b6afd3d573\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f"
Apr 17 10:21:43.021729 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:43.021706 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80916018-7c56-4e06-8526-15b6afd3d573-util\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f\" (UID: \"80916018-7c56-4e06-8526-15b6afd3d573\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f"
Apr 17 10:21:43.021791 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:43.021737 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80916018-7c56-4e06-8526-15b6afd3d573-bundle\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f\" (UID: \"80916018-7c56-4e06-8526-15b6afd3d573\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f"
Apr 17 10:21:43.029376 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:43.029340 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz4zx\" (UniqueName: \"kubernetes.io/projected/80916018-7c56-4e06-8526-15b6afd3d573-kube-api-access-nz4zx\") pod \"c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f\" (UID: \"80916018-7c56-4e06-8526-15b6afd3d573\") " pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f"
Apr 17 10:21:43.077008 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:43.076971 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f"
Apr 17 10:21:43.203888 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:43.203857 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f"]
Apr 17 10:21:43.206397 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:21:43.206362 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80916018_7c56_4e06_8526_15b6afd3d573.slice/crio-1cdbf34152e9b43d3e01a877f6af66ca6ede3d13baa79aecbf531e4896404db8 WatchSource:0}: Error finding container 1cdbf34152e9b43d3e01a877f6af66ca6ede3d13baa79aecbf531e4896404db8: Status 404 returned error can't find the container with id 1cdbf34152e9b43d3e01a877f6af66ca6ede3d13baa79aecbf531e4896404db8
Apr 17 10:21:43.845362 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:43.845320 2579 generic.go:358] "Generic (PLEG): container finished" podID="80916018-7c56-4e06-8526-15b6afd3d573" containerID="2ca72b072185d4f2227858f39997cdb3c6178036cf0daa0d2192946d2979e8a7" exitCode=0
Apr 17 10:21:43.845792 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:43.845403 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f" event={"ID":"80916018-7c56-4e06-8526-15b6afd3d573","Type":"ContainerDied","Data":"2ca72b072185d4f2227858f39997cdb3c6178036cf0daa0d2192946d2979e8a7"}
Apr 17 10:21:43.845792 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:43.845467 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f" event={"ID":"80916018-7c56-4e06-8526-15b6afd3d573","Type":"ContainerStarted","Data":"1cdbf34152e9b43d3e01a877f6af66ca6ede3d13baa79aecbf531e4896404db8"}
Apr 17 10:21:45.854328 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:45.854292 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f" event={"ID":"80916018-7c56-4e06-8526-15b6afd3d573","Type":"ContainerStarted","Data":"30a4b6c6dbaf90a087cf5df6435adb231384262f22f89ac6a9ac28de3b6ff5e8"}
Apr 17 10:21:46.859970 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:46.859933 2579 generic.go:358] "Generic (PLEG): container finished" podID="80916018-7c56-4e06-8526-15b6afd3d573" containerID="30a4b6c6dbaf90a087cf5df6435adb231384262f22f89ac6a9ac28de3b6ff5e8" exitCode=0
Apr 17 10:21:46.860348 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:46.860012 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f" event={"ID":"80916018-7c56-4e06-8526-15b6afd3d573","Type":"ContainerDied","Data":"30a4b6c6dbaf90a087cf5df6435adb231384262f22f89ac6a9ac28de3b6ff5e8"}
Apr 17 10:21:47.868411 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:47.868378 2579 generic.go:358] "Generic (PLEG): container finished" podID="80916018-7c56-4e06-8526-15b6afd3d573" containerID="8cbba03d04aa3a96810f8ebac77d7b2890db1c50108fec4948597d34833539ce" exitCode=0
Apr 17 10:21:47.868786 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:47.868446 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f" event={"ID":"80916018-7c56-4e06-8526-15b6afd3d573","Type":"ContainerDied","Data":"8cbba03d04aa3a96810f8ebac77d7b2890db1c50108fec4948597d34833539ce"}
Apr 17 10:21:48.998934 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:48.998906 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f"
Apr 17 10:21:49.069585 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:49.069550 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80916018-7c56-4e06-8526-15b6afd3d573-bundle\") pod \"80916018-7c56-4e06-8526-15b6afd3d573\" (UID: \"80916018-7c56-4e06-8526-15b6afd3d573\") "
Apr 17 10:21:49.069787 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:49.069663 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz4zx\" (UniqueName: \"kubernetes.io/projected/80916018-7c56-4e06-8526-15b6afd3d573-kube-api-access-nz4zx\") pod \"80916018-7c56-4e06-8526-15b6afd3d573\" (UID: \"80916018-7c56-4e06-8526-15b6afd3d573\") "
Apr 17 10:21:49.069787 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:49.069713 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80916018-7c56-4e06-8526-15b6afd3d573-util\") pod \"80916018-7c56-4e06-8526-15b6afd3d573\" (UID: \"80916018-7c56-4e06-8526-15b6afd3d573\") "
Apr 17 10:21:49.070042 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:49.070010 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80916018-7c56-4e06-8526-15b6afd3d573-bundle" (OuterVolumeSpecName: "bundle") pod "80916018-7c56-4e06-8526-15b6afd3d573" (UID: "80916018-7c56-4e06-8526-15b6afd3d573"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 10:21:49.071761 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:49.071724 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80916018-7c56-4e06-8526-15b6afd3d573-kube-api-access-nz4zx" (OuterVolumeSpecName: "kube-api-access-nz4zx") pod "80916018-7c56-4e06-8526-15b6afd3d573" (UID: "80916018-7c56-4e06-8526-15b6afd3d573"). InnerVolumeSpecName "kube-api-access-nz4zx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 10:21:49.074471 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:49.074438 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80916018-7c56-4e06-8526-15b6afd3d573-util" (OuterVolumeSpecName: "util") pod "80916018-7c56-4e06-8526-15b6afd3d573" (UID: "80916018-7c56-4e06-8526-15b6afd3d573"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 10:21:49.171078 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:49.170978 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nz4zx\" (UniqueName: \"kubernetes.io/projected/80916018-7c56-4e06-8526-15b6afd3d573-kube-api-access-nz4zx\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\""
Apr 17 10:21:49.171078 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:49.171012 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80916018-7c56-4e06-8526-15b6afd3d573-util\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\""
Apr 17 10:21:49.171078 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:49.171022 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80916018-7c56-4e06-8526-15b6afd3d573-bundle\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\""
Apr 17 10:21:49.877006 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:49.876975 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f"
Apr 17 10:21:49.877177 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:49.876976 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/c2ca89134faa49158137edb0141b62ea0c6a854657aff316cf72d9c78evzv6f" event={"ID":"80916018-7c56-4e06-8526-15b6afd3d573","Type":"ContainerDied","Data":"1cdbf34152e9b43d3e01a877f6af66ca6ede3d13baa79aecbf531e4896404db8"}
Apr 17 10:21:49.877177 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:49.877086 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cdbf34152e9b43d3e01a877f6af66ca6ede3d13baa79aecbf531e4896404db8"
Apr 17 10:21:55.396654 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.396613 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-gcn2w"]
Apr 17 10:21:55.397028 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.396950 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80916018-7c56-4e06-8526-15b6afd3d573" containerName="extract"
Apr 17 10:21:55.397028 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.396961 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="80916018-7c56-4e06-8526-15b6afd3d573" containerName="extract"
Apr 17 10:21:55.397028 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.396983 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80916018-7c56-4e06-8526-15b6afd3d573" containerName="pull"
Apr 17 10:21:55.397028 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.396989 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="80916018-7c56-4e06-8526-15b6afd3d573" containerName="pull"
Apr 17 10:21:55.397028 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.397002 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80916018-7c56-4e06-8526-15b6afd3d573" containerName="util"
Apr 17 10:21:55.397028 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.397018 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="80916018-7c56-4e06-8526-15b6afd3d573" containerName="util"
Apr 17 10:21:55.397211 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.397066 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="80916018-7c56-4e06-8526-15b6afd3d573" containerName="extract"
Apr 17 10:21:55.401655 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.401635 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-gcn2w"
Apr 17 10:21:55.404180 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.404153 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-operator-dockercfg-k2zwk\""
Apr 17 10:21:55.404325 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.404185 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\""
Apr 17 10:21:55.404325 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.404159 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\""
Apr 17 10:21:55.410158 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.410131 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-gcn2w"]
Apr 17 10:21:55.432544 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.432510 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c507b879-9e2a-412e-9f22-5413cf89d98f-tmp\") pod \"jobset-operator-747c5859c7-gcn2w\" (UID: \"c507b879-9e2a-412e-9f22-5413cf89d98f\") "
pod="openshift-jobset-operator/jobset-operator-747c5859c7-gcn2w" Apr 17 10:21:55.432759 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.432555 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjf7n\" (UniqueName: \"kubernetes.io/projected/c507b879-9e2a-412e-9f22-5413cf89d98f-kube-api-access-tjf7n\") pod \"jobset-operator-747c5859c7-gcn2w\" (UID: \"c507b879-9e2a-412e-9f22-5413cf89d98f\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-gcn2w" Apr 17 10:21:55.533663 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.533619 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c507b879-9e2a-412e-9f22-5413cf89d98f-tmp\") pod \"jobset-operator-747c5859c7-gcn2w\" (UID: \"c507b879-9e2a-412e-9f22-5413cf89d98f\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-gcn2w" Apr 17 10:21:55.533822 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.533672 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjf7n\" (UniqueName: \"kubernetes.io/projected/c507b879-9e2a-412e-9f22-5413cf89d98f-kube-api-access-tjf7n\") pod \"jobset-operator-747c5859c7-gcn2w\" (UID: \"c507b879-9e2a-412e-9f22-5413cf89d98f\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-gcn2w" Apr 17 10:21:55.534021 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.534003 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c507b879-9e2a-412e-9f22-5413cf89d98f-tmp\") pod \"jobset-operator-747c5859c7-gcn2w\" (UID: \"c507b879-9e2a-412e-9f22-5413cf89d98f\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-gcn2w" Apr 17 10:21:55.542294 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.542264 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjf7n\" (UniqueName: 
\"kubernetes.io/projected/c507b879-9e2a-412e-9f22-5413cf89d98f-kube-api-access-tjf7n\") pod \"jobset-operator-747c5859c7-gcn2w\" (UID: \"c507b879-9e2a-412e-9f22-5413cf89d98f\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-gcn2w" Apr 17 10:21:55.725502 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.725395 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-gcn2w" Apr 17 10:21:55.843485 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.843457 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-gcn2w"] Apr 17 10:21:55.846468 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:21:55.846438 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc507b879_9e2a_412e_9f22_5413cf89d98f.slice/crio-4307acdc82e5599eba83d3c60cabc8503e4d7116d2cfa42bcc5dfe5149bf1e1f WatchSource:0}: Error finding container 4307acdc82e5599eba83d3c60cabc8503e4d7116d2cfa42bcc5dfe5149bf1e1f: Status 404 returned error can't find the container with id 4307acdc82e5599eba83d3c60cabc8503e4d7116d2cfa42bcc5dfe5149bf1e1f Apr 17 10:21:55.896774 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:55.896735 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-gcn2w" event={"ID":"c507b879-9e2a-412e-9f22-5413cf89d98f","Type":"ContainerStarted","Data":"4307acdc82e5599eba83d3c60cabc8503e4d7116d2cfa42bcc5dfe5149bf1e1f"} Apr 17 10:21:57.905666 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:21:57.905632 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-gcn2w" event={"ID":"c507b879-9e2a-412e-9f22-5413cf89d98f","Type":"ContainerStarted","Data":"004618ecfe96c7aa362f76fcfbd6d69155a6e0f52d05036ddc1f5f67fe527e60"} Apr 17 10:21:57.922892 ip-10-0-128-180 kubenswrapper[2579]: 
I0417 10:21:57.922834 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-operator-747c5859c7-gcn2w" podStartSLOduration=1.220755709 podStartE2EDuration="2.922813447s" podCreationTimestamp="2026-04-17 10:21:55 +0000 UTC" firstStartedPulling="2026-04-17 10:21:55.848257282 +0000 UTC m=+275.584443614" lastFinishedPulling="2026-04-17 10:21:57.550315018 +0000 UTC m=+277.286501352" observedRunningTime="2026-04-17 10:21:57.919908281 +0000 UTC m=+277.656094719" watchObservedRunningTime="2026-04-17 10:21:57.922813447 +0000 UTC m=+277.658999800" Apr 17 10:22:11.497711 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:11.497676 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h"] Apr 17 10:22:11.501054 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:11.501034 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h" Apr 17 10:22:11.503282 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:11.503256 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"jobset-manager-config\"" Apr 17 10:22:11.504182 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:11.504162 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"metrics-server-cert\"" Apr 17 10:22:11.504296 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:11.504162 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"webhook-server-cert\"" Apr 17 10:22:11.504296 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:11.504261 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-controller-manager-dockercfg-m5748\"" Apr 17 10:22:11.509741 ip-10-0-128-180 kubenswrapper[2579]: I0417 
10:22:11.509716 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h"] Apr 17 10:22:11.575615 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:11.575578 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxqns\" (UniqueName: \"kubernetes.io/projected/12e6d84a-d886-4010-833b-673f093c07cf-kube-api-access-nxqns\") pod \"jobset-controller-manager-55f595dffd-wst4h\" (UID: \"12e6d84a-d886-4010-833b-673f093c07cf\") " pod="openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h" Apr 17 10:22:11.575829 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:11.575721 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/12e6d84a-d886-4010-833b-673f093c07cf-manager-config\") pod \"jobset-controller-manager-55f595dffd-wst4h\" (UID: \"12e6d84a-d886-4010-833b-673f093c07cf\") " pod="openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h" Apr 17 10:22:11.575829 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:11.575776 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12e6d84a-d886-4010-833b-673f093c07cf-cert\") pod \"jobset-controller-manager-55f595dffd-wst4h\" (UID: \"12e6d84a-d886-4010-833b-673f093c07cf\") " pod="openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h" Apr 17 10:22:11.575829 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:11.575810 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12e6d84a-d886-4010-833b-673f093c07cf-metrics-certs\") pod \"jobset-controller-manager-55f595dffd-wst4h\" (UID: \"12e6d84a-d886-4010-833b-673f093c07cf\") " 
pod="openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h" Apr 17 10:22:11.676277 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:11.676234 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12e6d84a-d886-4010-833b-673f093c07cf-cert\") pod \"jobset-controller-manager-55f595dffd-wst4h\" (UID: \"12e6d84a-d886-4010-833b-673f093c07cf\") " pod="openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h" Apr 17 10:22:11.676277 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:11.676281 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12e6d84a-d886-4010-833b-673f093c07cf-metrics-certs\") pod \"jobset-controller-manager-55f595dffd-wst4h\" (UID: \"12e6d84a-d886-4010-833b-673f093c07cf\") " pod="openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h" Apr 17 10:22:11.676557 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:11.676325 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxqns\" (UniqueName: \"kubernetes.io/projected/12e6d84a-d886-4010-833b-673f093c07cf-kube-api-access-nxqns\") pod \"jobset-controller-manager-55f595dffd-wst4h\" (UID: \"12e6d84a-d886-4010-833b-673f093c07cf\") " pod="openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h" Apr 17 10:22:11.676557 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:11.676425 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/12e6d84a-d886-4010-833b-673f093c07cf-manager-config\") pod \"jobset-controller-manager-55f595dffd-wst4h\" (UID: \"12e6d84a-d886-4010-833b-673f093c07cf\") " pod="openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h" Apr 17 10:22:11.677060 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:11.677036 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/12e6d84a-d886-4010-833b-673f093c07cf-manager-config\") pod \"jobset-controller-manager-55f595dffd-wst4h\" (UID: \"12e6d84a-d886-4010-833b-673f093c07cf\") " pod="openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h" Apr 17 10:22:11.678857 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:11.678825 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/12e6d84a-d886-4010-833b-673f093c07cf-cert\") pod \"jobset-controller-manager-55f595dffd-wst4h\" (UID: \"12e6d84a-d886-4010-833b-673f093c07cf\") " pod="openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h" Apr 17 10:22:11.678942 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:11.678921 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/12e6d84a-d886-4010-833b-673f093c07cf-metrics-certs\") pod \"jobset-controller-manager-55f595dffd-wst4h\" (UID: \"12e6d84a-d886-4010-833b-673f093c07cf\") " pod="openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h" Apr 17 10:22:11.684068 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:11.684046 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxqns\" (UniqueName: \"kubernetes.io/projected/12e6d84a-d886-4010-833b-673f093c07cf-kube-api-access-nxqns\") pod \"jobset-controller-manager-55f595dffd-wst4h\" (UID: \"12e6d84a-d886-4010-833b-673f093c07cf\") " pod="openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h" Apr 17 10:22:11.812580 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:11.812546 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h" Apr 17 10:22:11.936767 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:11.936739 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h"] Apr 17 10:22:11.938961 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:22:11.938928 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12e6d84a_d886_4010_833b_673f093c07cf.slice/crio-ae18e2fb5ae6287a5e7d1edbebcbb6b1e6791e2bb28db8b890aab440466b1210 WatchSource:0}: Error finding container ae18e2fb5ae6287a5e7d1edbebcbb6b1e6791e2bb28db8b890aab440466b1210: Status 404 returned error can't find the container with id ae18e2fb5ae6287a5e7d1edbebcbb6b1e6791e2bb28db8b890aab440466b1210 Apr 17 10:22:11.958066 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:11.958033 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h" event={"ID":"12e6d84a-d886-4010-833b-673f093c07cf","Type":"ContainerStarted","Data":"ae18e2fb5ae6287a5e7d1edbebcbb6b1e6791e2bb28db8b890aab440466b1210"} Apr 17 10:22:14.970616 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:14.970573 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h" event={"ID":"12e6d84a-d886-4010-833b-673f093c07cf","Type":"ContainerStarted","Data":"5bb46c4e009b30e2d1e1288857f5ffcccbb2c4cef10e7632749f324828440ddf"} Apr 17 10:22:14.971005 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:14.970670 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h" Apr 17 10:22:14.985602 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:14.985544 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h" podStartSLOduration=1.465937507 podStartE2EDuration="3.985528151s" podCreationTimestamp="2026-04-17 10:22:11 +0000 UTC" firstStartedPulling="2026-04-17 10:22:11.940871954 +0000 UTC m=+291.677058285" lastFinishedPulling="2026-04-17 10:22:14.460462583 +0000 UTC m=+294.196648929" observedRunningTime="2026-04-17 10:22:14.985023611 +0000 UTC m=+294.721209966" watchObservedRunningTime="2026-04-17 10:22:14.985528151 +0000 UTC m=+294.721714503" Apr 17 10:22:20.786061 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:20.786035 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/0.log" Apr 17 10:22:20.786486 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:20.786339 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/0.log" Apr 17 10:22:20.789124 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:20.789104 2579 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 10:22:25.979518 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:22:25.979484 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-jobset-operator/jobset-controller-manager-55f595dffd-wst4h" Apr 17 10:23:58.998766 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:58.998731 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-67b56bcb54-qb2mf"] Apr 17 10:23:59.002116 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.002091 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.014507 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.014480 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67b56bcb54-qb2mf"] Apr 17 10:23:59.032876 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.032840 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d227e23b-d414-4bae-adf6-8dae2b965019-trusted-ca-bundle\") pod \"console-67b56bcb54-qb2mf\" (UID: \"d227e23b-d414-4bae-adf6-8dae2b965019\") " pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.033032 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.032892 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d227e23b-d414-4bae-adf6-8dae2b965019-console-oauth-config\") pod \"console-67b56bcb54-qb2mf\" (UID: \"d227e23b-d414-4bae-adf6-8dae2b965019\") " pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.033032 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.032955 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkrlp\" (UniqueName: \"kubernetes.io/projected/d227e23b-d414-4bae-adf6-8dae2b965019-kube-api-access-gkrlp\") pod \"console-67b56bcb54-qb2mf\" (UID: \"d227e23b-d414-4bae-adf6-8dae2b965019\") " pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.033032 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.033015 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d227e23b-d414-4bae-adf6-8dae2b965019-service-ca\") pod \"console-67b56bcb54-qb2mf\" (UID: \"d227e23b-d414-4bae-adf6-8dae2b965019\") " pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 
10:23:59.033164 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.033039 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d227e23b-d414-4bae-adf6-8dae2b965019-console-config\") pod \"console-67b56bcb54-qb2mf\" (UID: \"d227e23b-d414-4bae-adf6-8dae2b965019\") " pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.033164 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.033056 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d227e23b-d414-4bae-adf6-8dae2b965019-oauth-serving-cert\") pod \"console-67b56bcb54-qb2mf\" (UID: \"d227e23b-d414-4bae-adf6-8dae2b965019\") " pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.033164 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.033131 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d227e23b-d414-4bae-adf6-8dae2b965019-console-serving-cert\") pod \"console-67b56bcb54-qb2mf\" (UID: \"d227e23b-d414-4bae-adf6-8dae2b965019\") " pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.134226 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.134196 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d227e23b-d414-4bae-adf6-8dae2b965019-trusted-ca-bundle\") pod \"console-67b56bcb54-qb2mf\" (UID: \"d227e23b-d414-4bae-adf6-8dae2b965019\") " pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.134382 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.134243 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d227e23b-d414-4bae-adf6-8dae2b965019-console-oauth-config\") pod 
\"console-67b56bcb54-qb2mf\" (UID: \"d227e23b-d414-4bae-adf6-8dae2b965019\") " pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.134442 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.134399 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkrlp\" (UniqueName: \"kubernetes.io/projected/d227e23b-d414-4bae-adf6-8dae2b965019-kube-api-access-gkrlp\") pod \"console-67b56bcb54-qb2mf\" (UID: \"d227e23b-d414-4bae-adf6-8dae2b965019\") " pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.134497 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.134475 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d227e23b-d414-4bae-adf6-8dae2b965019-service-ca\") pod \"console-67b56bcb54-qb2mf\" (UID: \"d227e23b-d414-4bae-adf6-8dae2b965019\") " pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.134553 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.134514 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d227e23b-d414-4bae-adf6-8dae2b965019-console-config\") pod \"console-67b56bcb54-qb2mf\" (UID: \"d227e23b-d414-4bae-adf6-8dae2b965019\") " pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.134553 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.134532 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d227e23b-d414-4bae-adf6-8dae2b965019-oauth-serving-cert\") pod \"console-67b56bcb54-qb2mf\" (UID: \"d227e23b-d414-4bae-adf6-8dae2b965019\") " pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.134553 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.134551 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d227e23b-d414-4bae-adf6-8dae2b965019-console-serving-cert\") pod \"console-67b56bcb54-qb2mf\" (UID: \"d227e23b-d414-4bae-adf6-8dae2b965019\") " pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.135216 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.135189 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d227e23b-d414-4bae-adf6-8dae2b965019-service-ca\") pod \"console-67b56bcb54-qb2mf\" (UID: \"d227e23b-d414-4bae-adf6-8dae2b965019\") " pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.135216 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.135211 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d227e23b-d414-4bae-adf6-8dae2b965019-console-config\") pod \"console-67b56bcb54-qb2mf\" (UID: \"d227e23b-d414-4bae-adf6-8dae2b965019\") " pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.135400 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.135195 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d227e23b-d414-4bae-adf6-8dae2b965019-oauth-serving-cert\") pod \"console-67b56bcb54-qb2mf\" (UID: \"d227e23b-d414-4bae-adf6-8dae2b965019\") " pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.135400 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.135191 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d227e23b-d414-4bae-adf6-8dae2b965019-trusted-ca-bundle\") pod \"console-67b56bcb54-qb2mf\" (UID: \"d227e23b-d414-4bae-adf6-8dae2b965019\") " pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.136706 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.136687 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d227e23b-d414-4bae-adf6-8dae2b965019-console-oauth-config\") pod \"console-67b56bcb54-qb2mf\" (UID: \"d227e23b-d414-4bae-adf6-8dae2b965019\") " pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.137028 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.137009 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d227e23b-d414-4bae-adf6-8dae2b965019-console-serving-cert\") pod \"console-67b56bcb54-qb2mf\" (UID: \"d227e23b-d414-4bae-adf6-8dae2b965019\") " pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.142794 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.142774 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkrlp\" (UniqueName: \"kubernetes.io/projected/d227e23b-d414-4bae-adf6-8dae2b965019-kube-api-access-gkrlp\") pod \"console-67b56bcb54-qb2mf\" (UID: \"d227e23b-d414-4bae-adf6-8dae2b965019\") " pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.312578 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.312536 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:23:59.438384 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.438355 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67b56bcb54-qb2mf"] Apr 17 10:23:59.440534 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:23:59.440501 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd227e23b_d414_4bae_adf6_8dae2b965019.slice/crio-0e59598f226c26df78dfe4430a18357daac3c5ad58a6991d0a228c5cc49835ed WatchSource:0}: Error finding container 0e59598f226c26df78dfe4430a18357daac3c5ad58a6991d0a228c5cc49835ed: Status 404 returned error can't find the container with id 0e59598f226c26df78dfe4430a18357daac3c5ad58a6991d0a228c5cc49835ed Apr 17 10:23:59.442429 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:23:59.442399 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 10:24:00.345610 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:00.345569 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67b56bcb54-qb2mf" event={"ID":"d227e23b-d414-4bae-adf6-8dae2b965019","Type":"ContainerStarted","Data":"94e6360a14e91f83d7df61078f43d14acf39524f36940c868a9ce162b9e1a97c"} Apr 17 10:24:00.345997 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:00.345616 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67b56bcb54-qb2mf" event={"ID":"d227e23b-d414-4bae-adf6-8dae2b965019","Type":"ContainerStarted","Data":"0e59598f226c26df78dfe4430a18357daac3c5ad58a6991d0a228c5cc49835ed"} Apr 17 10:24:00.363182 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:00.363135 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-67b56bcb54-qb2mf" podStartSLOduration=2.363121634 podStartE2EDuration="2.363121634s" podCreationTimestamp="2026-04-17 10:23:58 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 10:24:00.360660295 +0000 UTC m=+400.096846649" watchObservedRunningTime="2026-04-17 10:24:00.363121634 +0000 UTC m=+400.099307986" Apr 17 10:24:09.313093 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:09.313054 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:24:09.313093 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:09.313101 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:24:09.317905 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:09.317882 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:24:09.378778 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:09.378748 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-67b56bcb54-qb2mf" Apr 17 10:24:09.424205 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:09.424168 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b7c9cbb88-bbfdl"] Apr 17 10:24:34.446203 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.446096 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7b7c9cbb88-bbfdl" podUID="0e82d377-1ba5-4e0d-8823-e3b577dfd1c4" containerName="console" containerID="cri-o://626c9b7086314e637564c0f28916b1e952e15474cff8338e4156d42376fa2b0f" gracePeriod=15 Apr 17 10:24:34.574483 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.574446 2579 patch_prober.go:28] interesting pod/console-7b7c9cbb88-bbfdl container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.132.0.26:8443/health\": dial tcp 10.132.0.26:8443: connect: connection refused" 
start-of-body= Apr 17 10:24:34.574652 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.574528 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-7b7c9cbb88-bbfdl" podUID="0e82d377-1ba5-4e0d-8823-e3b577dfd1c4" containerName="console" probeResult="failure" output="Get \"https://10.132.0.26:8443/health\": dial tcp 10.132.0.26:8443: connect: connection refused" Apr 17 10:24:34.683775 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.683751 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b7c9cbb88-bbfdl_0e82d377-1ba5-4e0d-8823-e3b577dfd1c4/console/0.log" Apr 17 10:24:34.683909 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.683815 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:24:34.746317 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.746228 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-console-oauth-config\") pod \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " Apr 17 10:24:34.746317 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.746270 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-service-ca\") pod \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " Apr 17 10:24:34.746317 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.746304 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-console-serving-cert\") pod \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " Apr 17 
10:24:34.746601 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.746480 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-oauth-serving-cert\") pod \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " Apr 17 10:24:34.746601 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.746544 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-trusted-ca-bundle\") pod \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " Apr 17 10:24:34.746601 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.746592 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvc2l\" (UniqueName: \"kubernetes.io/projected/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-kube-api-access-bvc2l\") pod \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " Apr 17 10:24:34.746763 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.746623 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-console-config\") pod \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\" (UID: \"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4\") " Apr 17 10:24:34.746925 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.746871 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0e82d377-1ba5-4e0d-8823-e3b577dfd1c4" (UID: "0e82d377-1ba5-4e0d-8823-e3b577dfd1c4"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 10:24:34.747034 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.746998 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-oauth-serving-cert\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:24:34.747095 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.746993 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0e82d377-1ba5-4e0d-8823-e3b577dfd1c4" (UID: "0e82d377-1ba5-4e0d-8823-e3b577dfd1c4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 10:24:34.747162 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.747138 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-console-config" (OuterVolumeSpecName: "console-config") pod "0e82d377-1ba5-4e0d-8823-e3b577dfd1c4" (UID: "0e82d377-1ba5-4e0d-8823-e3b577dfd1c4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 10:24:34.747200 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.747185 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-service-ca" (OuterVolumeSpecName: "service-ca") pod "0e82d377-1ba5-4e0d-8823-e3b577dfd1c4" (UID: "0e82d377-1ba5-4e0d-8823-e3b577dfd1c4"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 10:24:34.748553 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.748530 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0e82d377-1ba5-4e0d-8823-e3b577dfd1c4" (UID: "0e82d377-1ba5-4e0d-8823-e3b577dfd1c4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 10:24:34.748654 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.748579 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0e82d377-1ba5-4e0d-8823-e3b577dfd1c4" (UID: "0e82d377-1ba5-4e0d-8823-e3b577dfd1c4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 10:24:34.748654 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.748628 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-kube-api-access-bvc2l" (OuterVolumeSpecName: "kube-api-access-bvc2l") pod "0e82d377-1ba5-4e0d-8823-e3b577dfd1c4" (UID: "0e82d377-1ba5-4e0d-8823-e3b577dfd1c4"). InnerVolumeSpecName "kube-api-access-bvc2l". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 10:24:34.847475 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.847401 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-console-serving-cert\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:24:34.847475 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.847469 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-trusted-ca-bundle\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:24:34.847475 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.847479 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bvc2l\" (UniqueName: \"kubernetes.io/projected/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-kube-api-access-bvc2l\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:24:34.847703 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.847489 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-console-config\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:24:34.847703 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.847499 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-console-oauth-config\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:24:34.847703 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:34.847507 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4-service-ca\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:24:35.466962 ip-10-0-128-180 
kubenswrapper[2579]: I0417 10:24:35.466927 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b7c9cbb88-bbfdl_0e82d377-1ba5-4e0d-8823-e3b577dfd1c4/console/0.log" Apr 17 10:24:35.467463 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:35.466973 2579 generic.go:358] "Generic (PLEG): container finished" podID="0e82d377-1ba5-4e0d-8823-e3b577dfd1c4" containerID="626c9b7086314e637564c0f28916b1e952e15474cff8338e4156d42376fa2b0f" exitCode=2 Apr 17 10:24:35.467463 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:35.467004 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b7c9cbb88-bbfdl" event={"ID":"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4","Type":"ContainerDied","Data":"626c9b7086314e637564c0f28916b1e952e15474cff8338e4156d42376fa2b0f"} Apr 17 10:24:35.467463 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:35.467032 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b7c9cbb88-bbfdl" event={"ID":"0e82d377-1ba5-4e0d-8823-e3b577dfd1c4","Type":"ContainerDied","Data":"417e1a39e5a3094b88486633f53d4e10807899eff66a53f68e454f05498fc38c"} Apr 17 10:24:35.467463 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:35.467033 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b7c9cbb88-bbfdl" Apr 17 10:24:35.467463 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:35.467052 2579 scope.go:117] "RemoveContainer" containerID="626c9b7086314e637564c0f28916b1e952e15474cff8338e4156d42376fa2b0f" Apr 17 10:24:35.475140 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:35.475122 2579 scope.go:117] "RemoveContainer" containerID="626c9b7086314e637564c0f28916b1e952e15474cff8338e4156d42376fa2b0f" Apr 17 10:24:35.475392 ip-10-0-128-180 kubenswrapper[2579]: E0417 10:24:35.475376 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"626c9b7086314e637564c0f28916b1e952e15474cff8338e4156d42376fa2b0f\": container with ID starting with 626c9b7086314e637564c0f28916b1e952e15474cff8338e4156d42376fa2b0f not found: ID does not exist" containerID="626c9b7086314e637564c0f28916b1e952e15474cff8338e4156d42376fa2b0f" Apr 17 10:24:35.475511 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:35.475403 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"626c9b7086314e637564c0f28916b1e952e15474cff8338e4156d42376fa2b0f"} err="failed to get container status \"626c9b7086314e637564c0f28916b1e952e15474cff8338e4156d42376fa2b0f\": rpc error: code = NotFound desc = could not find container \"626c9b7086314e637564c0f28916b1e952e15474cff8338e4156d42376fa2b0f\": container with ID starting with 626c9b7086314e637564c0f28916b1e952e15474cff8338e4156d42376fa2b0f not found: ID does not exist" Apr 17 10:24:35.483912 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:35.483884 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b7c9cbb88-bbfdl"] Apr 17 10:24:35.487168 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:24:35.487142 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7b7c9cbb88-bbfdl"] Apr 17 10:24:36.868538 ip-10-0-128-180 kubenswrapper[2579]: 
I0417 10:24:36.868500 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e82d377-1ba5-4e0d-8823-e3b577dfd1c4" path="/var/lib/kubelet/pods/0e82d377-1ba5-4e0d-8823-e3b577dfd1c4/volumes" Apr 17 10:27:20.820037 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:27:20.820002 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/0.log" Apr 17 10:27:20.820549 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:27:20.820438 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/0.log" Apr 17 10:32:20.850980 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:32:20.850945 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/0.log" Apr 17 10:32:20.851538 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:32:20.851384 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/0.log" Apr 17 10:37:20.883892 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:37:20.883855 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/0.log" Apr 17 10:37:20.886291 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:37:20.885567 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/0.log" Apr 17 10:42:20.919843 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:42:20.919731 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/0.log" Apr 17 10:42:20.923682 ip-10-0-128-180 
kubenswrapper[2579]: I0417 10:42:20.921658 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/0.log" Apr 17 10:47:20.944392 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:47:20.944271 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/0.log" Apr 17 10:47:20.950996 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:47:20.947758 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/0.log" Apr 17 10:52:20.968770 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:52:20.968656 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/0.log" Apr 17 10:52:20.973495 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:52:20.973475 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/0.log" Apr 17 10:56:39.789653 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:39.789613 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2"] Apr 17 10:56:39.790130 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:39.789963 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e82d377-1ba5-4e0d-8823-e3b577dfd1c4" containerName="console" Apr 17 10:56:39.790130 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:39.789974 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e82d377-1ba5-4e0d-8823-e3b577dfd1c4" containerName="console" Apr 17 10:56:39.790130 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:39.790047 2579 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="0e82d377-1ba5-4e0d-8823-e3b577dfd1c4" containerName="console" Apr 17 10:56:39.792976 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:39.792959 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2" Apr 17 10:56:39.795279 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:39.795253 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-wshsg\"/\"kube-root-ca.crt\"" Apr 17 10:56:39.795409 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:39.795253 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-wshsg\"/\"default-dockercfg-hm8vm\"" Apr 17 10:56:39.796043 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:39.796028 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-wshsg\"/\"openshift-service-ca.crt\"" Apr 17 10:56:39.799690 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:39.799652 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2"] Apr 17 10:56:39.957872 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:39.957832 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkw76\" (UniqueName: \"kubernetes.io/projected/c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb-kube-api-access-zkw76\") pod \"test-trainjob-jnd7b-dataset-initializer-0-0-df8d2\" (UID: \"c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb\") " pod="test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2" Apr 17 10:56:40.058788 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:40.058696 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkw76\" (UniqueName: \"kubernetes.io/projected/c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb-kube-api-access-zkw76\") pod \"test-trainjob-jnd7b-dataset-initializer-0-0-df8d2\" (UID: 
\"c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb\") " pod="test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2" Apr 17 10:56:40.066305 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:40.066278 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkw76\" (UniqueName: \"kubernetes.io/projected/c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb-kube-api-access-zkw76\") pod \"test-trainjob-jnd7b-dataset-initializer-0-0-df8d2\" (UID: \"c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb\") " pod="test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2" Apr 17 10:56:47.624692 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:47.624649 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78\") pod \"test-trainjob-jnd7b-dataset-initializer-0-0-df8d2\" (UID: \"c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb\") " pod="test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2" Apr 17 10:56:47.629763 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:47.629736 2579 operation_generator.go:1469] "Controller attach succeeded for volume \"pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78\") pod \"test-trainjob-jnd7b-dataset-initializer-0-0-df8d2\" (UID: \"c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb\") device path: \"\"" pod="test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2" Apr 17 10:56:47.725105 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:47.725063 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78\") pod \"test-trainjob-jnd7b-dataset-initializer-0-0-df8d2\" (UID: \"c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb\") " pod="test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2" Apr 17 
10:56:47.725314 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:47.725213 2579 operation_generator.go:515] "MountVolume.WaitForAttach entering for volume \"pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78\") pod \"test-trainjob-jnd7b-dataset-initializer-0-0-df8d2\" (UID: \"c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb\") DevicePath \"\"" pod="test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2" Apr 17 10:56:47.727704 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:47.727676 2579 operation_generator.go:525] "MountVolume.WaitForAttach succeeded for volume \"pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78\") pod \"test-trainjob-jnd7b-dataset-initializer-0-0-df8d2\" (UID: \"c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb\") DevicePath \"csi-7fdbf44c58426f9e5c0919d96524649aa6e242df190fc5cdd424c4419d386d96\"" pod="test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2" Apr 17 10:56:47.955885 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:47.955849 2579 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78\") pod \"test-trainjob-jnd7b-dataset-initializer-0-0-df8d2\" (UID: \"c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/ebs.csi.aws.com/784aa42007583401c6a26cbc39ac2e4f9cb93752e1936e677ecf942d0b754944/globalmount\"" pod="test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2" Apr 17 10:56:47.969979 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:47.969949 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78\") pod \"test-trainjob-jnd7b-dataset-initializer-0-0-df8d2\" (UID: 
\"c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb\") " pod="test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2" Apr 17 10:56:48.203845 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:48.203801 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2" Apr 17 10:56:48.330776 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:48.330740 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2"] Apr 17 10:56:48.334288 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:56:48.334255 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc688f5a6_6c00_4d11_aa4d_cf0f08c9e4eb.slice/crio-29a91cfbcdf454e279fa00fb42ff539f808587ef957e8af7cf7c87e56e711f83 WatchSource:0}: Error finding container 29a91cfbcdf454e279fa00fb42ff539f808587ef957e8af7cf7c87e56e711f83: Status 404 returned error can't find the container with id 29a91cfbcdf454e279fa00fb42ff539f808587ef957e8af7cf7c87e56e711f83 Apr 17 10:56:48.336678 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:48.336661 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 10:56:49.125809 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:49.125765 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2" event={"ID":"c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb","Type":"ContainerStarted","Data":"29a91cfbcdf454e279fa00fb42ff539f808587ef957e8af7cf7c87e56e711f83"} Apr 17 10:56:53.143964 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:53.143926 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2" 
event={"ID":"c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb","Type":"ContainerStarted","Data":"c2f4b13a66d16e0d51bd7a34cfce9e63ad8cd3004367b5a8408f3faac1098173"} Apr 17 10:56:53.165075 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:53.165018 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2" podStartSLOduration=13.04225599 podStartE2EDuration="17.165003083s" podCreationTimestamp="2026-04-17 10:56:36 +0000 UTC" firstStartedPulling="2026-04-17 10:56:48.336786357 +0000 UTC m=+2368.072972689" lastFinishedPulling="2026-04-17 10:56:52.459533443 +0000 UTC m=+2372.195719782" observedRunningTime="2026-04-17 10:56:53.1627461 +0000 UTC m=+2372.898932464" watchObservedRunningTime="2026-04-17 10:56:53.165003083 +0000 UTC m=+2372.901189436" Apr 17 10:56:58.162005 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:58.161965 2579 generic.go:358] "Generic (PLEG): container finished" podID="c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb" containerID="c2f4b13a66d16e0d51bd7a34cfce9e63ad8cd3004367b5a8408f3faac1098173" exitCode=0 Apr 17 10:56:58.162516 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:58.162041 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2" event={"ID":"c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb","Type":"ContainerDied","Data":"c2f4b13a66d16e0d51bd7a34cfce9e63ad8cd3004367b5a8408f3faac1098173"} Apr 17 10:56:59.290260 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:59.290235 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2" Apr 17 10:56:59.438587 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:59.438506 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workspace\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78\") pod \"c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb\" (UID: \"c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb\") " Apr 17 10:56:59.438587 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:59.438576 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkw76\" (UniqueName: \"kubernetes.io/projected/c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb-kube-api-access-zkw76\") pod \"c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb\" (UID: \"c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb\") " Apr 17 10:56:59.440816 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:59.440786 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb-kube-api-access-zkw76" (OuterVolumeSpecName: "kube-api-access-zkw76") pod "c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb" (UID: "c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb"). InnerVolumeSpecName "kube-api-access-zkw76". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 10:56:59.441538 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:59.441517 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78" (OuterVolumeSpecName: "workspace") pod "c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb" (UID: "c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb"). InnerVolumeSpecName "pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6". 
PluginName "kubernetes.io/csi", VolumeGIDValue "" Apr 17 10:56:59.539597 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:59.539562 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zkw76\" (UniqueName: \"kubernetes.io/projected/c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb-kube-api-access-zkw76\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:56:59.539769 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:59.539615 2579 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78\") on node \"ip-10-0-128-180.ec2.internal\" " Apr 17 10:56:59.556478 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:59.556456 2579 operation_generator.go:895] UnmountDevice succeeded for volume "pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6" (UniqueName: "kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78") on node "ip-10-0-128-180.ec2.internal" Apr 17 10:56:59.640444 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:56:59.640392 2579 reconciler_common.go:299] "Volume detached for volume \"pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"csi-7fdbf44c58426f9e5c0919d96524649aa6e242df190fc5cdd424c4419d386d96\"" Apr 17 10:57:00.170443 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:00.170396 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2" Apr 17 10:57:00.170650 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:00.170397 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2" event={"ID":"c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb","Type":"ContainerDied","Data":"29a91cfbcdf454e279fa00fb42ff539f808587ef957e8af7cf7c87e56e711f83"} Apr 17 10:57:00.170650 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:00.170528 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29a91cfbcdf454e279fa00fb42ff539f808587ef957e8af7cf7c87e56e711f83" Apr 17 10:57:00.249587 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:00.249551 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl"] Apr 17 10:57:00.249991 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:00.249974 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb" containerName="dataset-initializer" Apr 17 10:57:00.250082 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:00.249994 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb" containerName="dataset-initializer" Apr 17 10:57:00.250135 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:00.250107 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb" containerName="dataset-initializer" Apr 17 10:57:00.257361 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:00.257324 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl" Apr 17 10:57:00.260095 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:00.259922 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-wshsg\"/\"default-dockercfg-hm8vm\"" Apr 17 10:57:00.260095 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:00.259981 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-wshsg\"/\"kube-root-ca.crt\"" Apr 17 10:57:00.260095 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:00.260008 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-wshsg\"/\"openshift-service-ca.crt\"" Apr 17 10:57:00.260431 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:00.260397 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl"] Apr 17 10:57:00.346472 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:00.346432 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26zgm\" (UniqueName: \"kubernetes.io/projected/8b3e8919-bbf4-423f-b11c-36b18f83730e-kube-api-access-26zgm\") pod \"test-trainjob-jnd7b-model-initializer-0-0-xxlpl\" (UID: \"8b3e8919-bbf4-423f-b11c-36b18f83730e\") " pod="test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl" Apr 17 10:57:00.447220 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:00.447129 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26zgm\" (UniqueName: \"kubernetes.io/projected/8b3e8919-bbf4-423f-b11c-36b18f83730e-kube-api-access-26zgm\") pod \"test-trainjob-jnd7b-model-initializer-0-0-xxlpl\" (UID: \"8b3e8919-bbf4-423f-b11c-36b18f83730e\") " pod="test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl" Apr 17 10:57:00.455438 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:00.455396 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-26zgm\" (UniqueName: \"kubernetes.io/projected/8b3e8919-bbf4-423f-b11c-36b18f83730e-kube-api-access-26zgm\") pod \"test-trainjob-jnd7b-model-initializer-0-0-xxlpl\" (UID: \"8b3e8919-bbf4-423f-b11c-36b18f83730e\") " pod="test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl" Apr 17 10:57:07.914315 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:07.914271 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78\") pod \"test-trainjob-jnd7b-model-initializer-0-0-xxlpl\" (UID: \"8b3e8919-bbf4-423f-b11c-36b18f83730e\") " pod="test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl" Apr 17 10:57:07.920762 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:07.920731 2579 operation_generator.go:1469] "Controller attach succeeded for volume \"pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78\") pod \"test-trainjob-jnd7b-model-initializer-0-0-xxlpl\" (UID: \"8b3e8919-bbf4-423f-b11c-36b18f83730e\") device path: \"\"" pod="test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl" Apr 17 10:57:08.014979 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:08.014936 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78\") pod \"test-trainjob-jnd7b-model-initializer-0-0-xxlpl\" (UID: \"8b3e8919-bbf4-423f-b11c-36b18f83730e\") " pod="test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl" Apr 17 10:57:08.015174 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:08.015106 2579 operation_generator.go:515] "MountVolume.WaitForAttach entering for volume \"pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6\" (UniqueName: 
\"kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78\") pod \"test-trainjob-jnd7b-model-initializer-0-0-xxlpl\" (UID: \"8b3e8919-bbf4-423f-b11c-36b18f83730e\") DevicePath \"\"" pod="test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl" Apr 17 10:57:08.018193 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:08.018160 2579 operation_generator.go:525] "MountVolume.WaitForAttach succeeded for volume \"pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78\") pod \"test-trainjob-jnd7b-model-initializer-0-0-xxlpl\" (UID: \"8b3e8919-bbf4-423f-b11c-36b18f83730e\") DevicePath \"csi-7fdbf44c58426f9e5c0919d96524649aa6e242df190fc5cdd424c4419d386d96\"" pod="test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl" Apr 17 10:57:08.140815 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:08.140781 2579 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78\") pod \"test-trainjob-jnd7b-model-initializer-0-0-xxlpl\" (UID: \"8b3e8919-bbf4-423f-b11c-36b18f83730e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/ebs.csi.aws.com/784aa42007583401c6a26cbc39ac2e4f9cb93752e1936e677ecf942d0b754944/globalmount\"" pod="test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl" Apr 17 10:57:08.156132 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:08.156097 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78\") pod \"test-trainjob-jnd7b-model-initializer-0-0-xxlpl\" (UID: \"8b3e8919-bbf4-423f-b11c-36b18f83730e\") " pod="test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl" Apr 17 10:57:08.369182 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:08.369143 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl" Apr 17 10:57:08.498249 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:08.498222 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl"] Apr 17 10:57:08.500339 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:57:08.500313 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b3e8919_bbf4_423f_b11c_36b18f83730e.slice/crio-cb484bcc3388d226c897a8ed65d3367fce6addf018590aef87149c22d92ed7d6 WatchSource:0}: Error finding container cb484bcc3388d226c897a8ed65d3367fce6addf018590aef87149c22d92ed7d6: Status 404 returned error can't find the container with id cb484bcc3388d226c897a8ed65d3367fce6addf018590aef87149c22d92ed7d6 Apr 17 10:57:09.205246 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:09.205205 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl" event={"ID":"8b3e8919-bbf4-423f-b11c-36b18f83730e","Type":"ContainerStarted","Data":"cb484bcc3388d226c897a8ed65d3367fce6addf018590aef87149c22d92ed7d6"} Apr 17 10:57:20.998084 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:20.997972 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/0.log" Apr 17 10:57:21.015341 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:57:21.002019 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/0.log" Apr 17 10:58:11.437869 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:11.437825 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl" 
event={"ID":"8b3e8919-bbf4-423f-b11c-36b18f83730e","Type":"ContainerStarted","Data":"61424f0c55a24ca5066e868b208f94c98f2372e75aa5d0402f260e890bec75d7"} Apr 17 10:58:11.456550 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:11.454903 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl" podStartSLOduration=8.648047163 podStartE2EDuration="1m11.45488391s" podCreationTimestamp="2026-04-17 10:57:00 +0000 UTC" firstStartedPulling="2026-04-17 10:57:08.502322474 +0000 UTC m=+2388.238508804" lastFinishedPulling="2026-04-17 10:58:11.309159207 +0000 UTC m=+2451.045345551" observedRunningTime="2026-04-17 10:58:11.452179534 +0000 UTC m=+2451.188365890" watchObservedRunningTime="2026-04-17 10:58:11.45488391 +0000 UTC m=+2451.191070265" Apr 17 10:58:22.478293 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:22.478255 2579 generic.go:358] "Generic (PLEG): container finished" podID="8b3e8919-bbf4-423f-b11c-36b18f83730e" containerID="61424f0c55a24ca5066e868b208f94c98f2372e75aa5d0402f260e890bec75d7" exitCode=0 Apr 17 10:58:22.478719 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:22.478306 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl" event={"ID":"8b3e8919-bbf4-423f-b11c-36b18f83730e","Type":"ContainerDied","Data":"61424f0c55a24ca5066e868b208f94c98f2372e75aa5d0402f260e890bec75d7"} Apr 17 10:58:23.608300 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:23.608277 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl" Apr 17 10:58:23.695283 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:23.695250 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workspace\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78\") pod \"8b3e8919-bbf4-423f-b11c-36b18f83730e\" (UID: \"8b3e8919-bbf4-423f-b11c-36b18f83730e\") " Apr 17 10:58:23.695471 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:23.695348 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26zgm\" (UniqueName: \"kubernetes.io/projected/8b3e8919-bbf4-423f-b11c-36b18f83730e-kube-api-access-26zgm\") pod \"8b3e8919-bbf4-423f-b11c-36b18f83730e\" (UID: \"8b3e8919-bbf4-423f-b11c-36b18f83730e\") " Apr 17 10:58:23.697576 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:23.697544 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b3e8919-bbf4-423f-b11c-36b18f83730e-kube-api-access-26zgm" (OuterVolumeSpecName: "kube-api-access-26zgm") pod "8b3e8919-bbf4-423f-b11c-36b18f83730e" (UID: "8b3e8919-bbf4-423f-b11c-36b18f83730e"). InnerVolumeSpecName "kube-api-access-26zgm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 10:58:23.698129 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:23.698109 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78" (OuterVolumeSpecName: "workspace") pod "8b3e8919-bbf4-423f-b11c-36b18f83730e" (UID: "8b3e8919-bbf4-423f-b11c-36b18f83730e"). InnerVolumeSpecName "pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6". 
PluginName "kubernetes.io/csi", VolumeGIDValue "" Apr 17 10:58:23.796678 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:23.796639 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-26zgm\" (UniqueName: \"kubernetes.io/projected/8b3e8919-bbf4-423f-b11c-36b18f83730e-kube-api-access-26zgm\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:58:23.796847 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:23.796696 2579 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78\") on node \"ip-10-0-128-180.ec2.internal\" " Apr 17 10:58:23.860253 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:23.860232 2579 operation_generator.go:895] UnmountDevice succeeded for volume "pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6" (UniqueName: "kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78") on node "ip-10-0-128-180.ec2.internal" Apr 17 10:58:23.897477 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:23.897402 2579 reconciler_common.go:299] "Volume detached for volume \"pvc-7e04d169-e7bc-4194-bb9b-85d31a86a8b6\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0ed997194a4d9bb78\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"csi-7fdbf44c58426f9e5c0919d96524649aa6e242df190fc5cdd424c4419d386d96\"" Apr 17 10:58:24.486209 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:24.486174 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl" event={"ID":"8b3e8919-bbf4-423f-b11c-36b18f83730e","Type":"ContainerDied","Data":"cb484bcc3388d226c897a8ed65d3367fce6addf018590aef87149c22d92ed7d6"} Apr 17 10:58:24.486209 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:24.486198 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl" Apr 17 10:58:24.486209 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:24.486209 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb484bcc3388d226c897a8ed65d3367fce6addf018590aef87149c22d92ed7d6" Apr 17 10:58:42.537507 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:42.537470 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/test-ns-wshsg_test-trainjob-jnd7b-dataset-initializer-0-0-df8d2_c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb/dataset-initializer/0.log" Apr 17 10:58:42.545251 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:42.545219 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/test-ns-wshsg_test-trainjob-jnd7b-model-initializer-0-0-xxlpl_8b3e8919-bbf4-423f-b11c-36b18f83730e/model-initializer/0.log" Apr 17 10:58:45.760915 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:45.760874 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg"] Apr 17 10:58:45.761289 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:45.761230 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b3e8919-bbf4-423f-b11c-36b18f83730e" containerName="model-initializer" Apr 17 10:58:45.761289 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:45.761241 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b3e8919-bbf4-423f-b11c-36b18f83730e" containerName="model-initializer" Apr 17 10:58:45.761362 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:45.761303 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="8b3e8919-bbf4-423f-b11c-36b18f83730e" containerName="model-initializer" Apr 17 10:58:45.764206 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:45.764189 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg" Apr 17 10:58:45.766453 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:45.766407 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"test-ns-2whsd\"/\"default-dockercfg-kjxl9\"" Apr 17 10:58:45.766453 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:45.766448 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-2whsd\"/\"openshift-service-ca.crt\"" Apr 17 10:58:45.767239 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:45.767223 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"test-ns-2whsd\"/\"kube-root-ca.crt\"" Apr 17 10:58:45.773621 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:45.773583 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg"] Apr 17 10:58:45.897389 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:45.897357 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g69ds\" (UniqueName: \"kubernetes.io/projected/27d461b3-a42b-40a1-abf6-97361b6000fb-kube-api-access-g69ds\") pod \"test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg\" (UID: \"27d461b3-a42b-40a1-abf6-97361b6000fb\") " pod="test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg" Apr 17 10:58:45.998687 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:45.998649 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g69ds\" (UniqueName: \"kubernetes.io/projected/27d461b3-a42b-40a1-abf6-97361b6000fb-kube-api-access-g69ds\") pod \"test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg\" (UID: \"27d461b3-a42b-40a1-abf6-97361b6000fb\") " pod="test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg" Apr 17 10:58:46.006777 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:46.006747 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g69ds\" (UniqueName: \"kubernetes.io/projected/27d461b3-a42b-40a1-abf6-97361b6000fb-kube-api-access-g69ds\") pod \"test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg\" (UID: \"27d461b3-a42b-40a1-abf6-97361b6000fb\") " pod="test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg" Apr 17 10:58:47.596387 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:47.596348 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2"] Apr 17 10:58:47.599577 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:47.599548 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-wshsg/test-trainjob-jnd7b-dataset-initializer-0-0-df8d2"] Apr 17 10:58:47.606488 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:47.606462 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl"] Apr 17 10:58:47.611584 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:47.611561 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-wshsg/test-trainjob-jnd7b-model-initializer-0-0-xxlpl"] Apr 17 10:58:48.868588 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:48.868551 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b3e8919-bbf4-423f-b11c-36b18f83730e" path="/var/lib/kubelet/pods/8b3e8919-bbf4-423f-b11c-36b18f83730e/volumes" Apr 17 10:58:48.869239 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:48.868923 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb" path="/var/lib/kubelet/pods/c688f5a6-6c00-4d11-aa4d-cf0f08c9e4eb/volumes" Apr 17 10:58:49.933506 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:49.933469 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-43561e99-a34c-4f31-99ce-10a121393aa0\" (UniqueName: 
\"kubernetes.io/csi/ebs.csi.aws.com^vol-0be1ca2a52e2fbd93\") pod \"test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg\" (UID: \"27d461b3-a42b-40a1-abf6-97361b6000fb\") " pod="test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg" Apr 17 10:58:49.938256 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:49.938228 2579 operation_generator.go:1469] "Controller attach succeeded for volume \"pvc-43561e99-a34c-4f31-99ce-10a121393aa0\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0be1ca2a52e2fbd93\") pod \"test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg\" (UID: \"27d461b3-a42b-40a1-abf6-97361b6000fb\") device path: \"\"" pod="test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg" Apr 17 10:58:50.034304 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:50.034267 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"pvc-43561e99-a34c-4f31-99ce-10a121393aa0\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0be1ca2a52e2fbd93\") pod \"test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg\" (UID: \"27d461b3-a42b-40a1-abf6-97361b6000fb\") " pod="test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg" Apr 17 10:58:50.034512 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:50.034439 2579 operation_generator.go:515] "MountVolume.WaitForAttach entering for volume \"pvc-43561e99-a34c-4f31-99ce-10a121393aa0\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0be1ca2a52e2fbd93\") pod \"test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg\" (UID: \"27d461b3-a42b-40a1-abf6-97361b6000fb\") DevicePath \"\"" pod="test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg" Apr 17 10:58:50.037538 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:50.037516 2579 operation_generator.go:525] "MountVolume.WaitForAttach succeeded for volume \"pvc-43561e99-a34c-4f31-99ce-10a121393aa0\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0be1ca2a52e2fbd93\") pod 
\"test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg\" (UID: \"27d461b3-a42b-40a1-abf6-97361b6000fb\") DevicePath \"csi-25ac856e3409621720b5040a7e022192cd3794502023fc5f0fe089b6cfe926a2\"" pod="test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg" Apr 17 10:58:50.201716 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:50.201681 2579 operation_generator.go:557] "MountVolume.MountDevice succeeded for volume \"pvc-43561e99-a34c-4f31-99ce-10a121393aa0\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0be1ca2a52e2fbd93\") pod \"test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg\" (UID: \"27d461b3-a42b-40a1-abf6-97361b6000fb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/ebs.csi.aws.com/7dd6d8c01afd21843fc7c416f2574587a76386b51b40b4d2d1c68bf02dbccdc8/globalmount\"" pod="test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg" Apr 17 10:58:50.215149 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:50.215119 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"pvc-43561e99-a34c-4f31-99ce-10a121393aa0\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0be1ca2a52e2fbd93\") pod \"test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg\" (UID: \"27d461b3-a42b-40a1-abf6-97361b6000fb\") " pod="test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg" Apr 17 10:58:50.281133 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:50.281096 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg" Apr 17 10:58:50.406376 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:50.406347 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg"] Apr 17 10:58:50.408908 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:58:50.408869 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27d461b3_a42b_40a1_abf6_97361b6000fb.slice/crio-6b06e6ff08adabc0c65af0a1b13050230ef5ec7caa5a26a4380a2bd7a9128ea5 WatchSource:0}: Error finding container 6b06e6ff08adabc0c65af0a1b13050230ef5ec7caa5a26a4380a2bd7a9128ea5: Status 404 returned error can't find the container with id 6b06e6ff08adabc0c65af0a1b13050230ef5ec7caa5a26a4380a2bd7a9128ea5 Apr 17 10:58:50.583269 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:50.583238 2579 generic.go:358] "Generic (PLEG): container finished" podID="27d461b3-a42b-40a1-abf6-97361b6000fb" containerID="16718f25d7cf78ccf375ed180494ad5b8f63adac0553bea0868696aff20ca95e" exitCode=1 Apr 17 10:58:50.583463 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:50.583276 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg" event={"ID":"27d461b3-a42b-40a1-abf6-97361b6000fb","Type":"ContainerDied","Data":"16718f25d7cf78ccf375ed180494ad5b8f63adac0553bea0868696aff20ca95e"} Apr 17 10:58:50.583463 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:50.583317 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg" event={"ID":"27d461b3-a42b-40a1-abf6-97361b6000fb","Type":"ContainerStarted","Data":"6b06e6ff08adabc0c65af0a1b13050230ef5ec7caa5a26a4380a2bd7a9128ea5"} Apr 17 10:58:51.707004 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:51.706980 2579 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg" Apr 17 10:58:51.852513 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:51.852392 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g69ds\" (UniqueName: \"kubernetes.io/projected/27d461b3-a42b-40a1-abf6-97361b6000fb-kube-api-access-g69ds\") pod \"27d461b3-a42b-40a1-abf6-97361b6000fb\" (UID: \"27d461b3-a42b-40a1-abf6-97361b6000fb\") " Apr 17 10:58:51.852679 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:51.852598 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workspace\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0be1ca2a52e2fbd93\") pod \"27d461b3-a42b-40a1-abf6-97361b6000fb\" (UID: \"27d461b3-a42b-40a1-abf6-97361b6000fb\") " Apr 17 10:58:51.854870 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:51.854841 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27d461b3-a42b-40a1-abf6-97361b6000fb-kube-api-access-g69ds" (OuterVolumeSpecName: "kube-api-access-g69ds") pod "27d461b3-a42b-40a1-abf6-97361b6000fb" (UID: "27d461b3-a42b-40a1-abf6-97361b6000fb"). InnerVolumeSpecName "kube-api-access-g69ds". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 10:58:51.855392 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:51.855374 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/ebs.csi.aws.com^vol-0be1ca2a52e2fbd93" (OuterVolumeSpecName: "workspace") pod "27d461b3-a42b-40a1-abf6-97361b6000fb" (UID: "27d461b3-a42b-40a1-abf6-97361b6000fb"). InnerVolumeSpecName "pvc-43561e99-a34c-4f31-99ce-10a121393aa0". 
PluginName "kubernetes.io/csi", VolumeGIDValue "" Apr 17 10:58:51.953479 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:51.953434 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g69ds\" (UniqueName: \"kubernetes.io/projected/27d461b3-a42b-40a1-abf6-97361b6000fb-kube-api-access-g69ds\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"\"" Apr 17 10:58:51.953677 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:51.953493 2579 reconciler_common.go:292] "operationExecutor.UnmountDevice started for volume \"pvc-43561e99-a34c-4f31-99ce-10a121393aa0\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0be1ca2a52e2fbd93\") on node \"ip-10-0-128-180.ec2.internal\" " Apr 17 10:58:51.967387 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:51.967365 2579 operation_generator.go:895] UnmountDevice succeeded for volume "pvc-43561e99-a34c-4f31-99ce-10a121393aa0" (UniqueName: "kubernetes.io/csi/ebs.csi.aws.com^vol-0be1ca2a52e2fbd93") on node "ip-10-0-128-180.ec2.internal" Apr 17 10:58:52.054483 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:52.054442 2579 reconciler_common.go:299] "Volume detached for volume \"pvc-43561e99-a34c-4f31-99ce-10a121393aa0\" (UniqueName: \"kubernetes.io/csi/ebs.csi.aws.com^vol-0be1ca2a52e2fbd93\") on node \"ip-10-0-128-180.ec2.internal\" DevicePath \"csi-25ac856e3409621720b5040a7e022192cd3794502023fc5f0fe089b6cfe926a2\"" Apr 17 10:58:52.591745 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:52.591707 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg" event={"ID":"27d461b3-a42b-40a1-abf6-97361b6000fb","Type":"ContainerDied","Data":"6b06e6ff08adabc0c65af0a1b13050230ef5ec7caa5a26a4380a2bd7a9128ea5"} Apr 17 10:58:52.591745 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:52.591747 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b06e6ff08adabc0c65af0a1b13050230ef5ec7caa5a26a4380a2bd7a9128ea5" Apr 17 
10:58:52.591949 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:52.591721 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg" Apr 17 10:58:53.732431 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:53.732398 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/test-ns-2whsd_test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg_27d461b3-a42b-40a1-abf6-97361b6000fb/dataset-initializer/0.log" Apr 17 10:58:58.762456 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:58.762403 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg"] Apr 17 10:58:58.766029 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:58.766002 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["test-ns-2whsd/test-trainjob-fail-gjjds-dataset-initializer-0-0-5clzg"] Apr 17 10:58:58.869399 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:58:58.869323 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27d461b3-a42b-40a1-abf6-97361b6000fb" path="/var/lib/kubelet/pods/27d461b3-a42b-40a1-abf6-97361b6000fb/volumes" Apr 17 10:59:48.580320 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:48.580280 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-g42w5_9e8f4822-a90d-4e05-81a2-704623b16ed2/global-pull-secret-syncer/0.log" Apr 17 10:59:48.694859 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:48.694829 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-8pm8f_9ff2c5cc-5a1b-4207-b5da-e7cd37db94b0/konnectivity-agent/0.log" Apr 17 10:59:48.777790 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:48.777756 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-180.ec2.internal_63dc35addb937cfe54aeec10c17385f5/haproxy/0.log" Apr 17 10:59:51.727702 
ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:51.727620 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_94d89ba1-46d8-4c66-bf40-66f7dc002163/alertmanager/0.log" Apr 17 10:59:51.763512 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:51.763481 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_94d89ba1-46d8-4c66-bf40-66f7dc002163/config-reloader/0.log" Apr 17 10:59:51.794230 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:51.794200 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_94d89ba1-46d8-4c66-bf40-66f7dc002163/kube-rbac-proxy-web/0.log" Apr 17 10:59:51.817013 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:51.816987 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_94d89ba1-46d8-4c66-bf40-66f7dc002163/kube-rbac-proxy/0.log" Apr 17 10:59:51.842296 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:51.842272 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_94d89ba1-46d8-4c66-bf40-66f7dc002163/kube-rbac-proxy-metric/0.log" Apr 17 10:59:51.864263 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:51.864238 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_94d89ba1-46d8-4c66-bf40-66f7dc002163/prom-label-proxy/0.log" Apr 17 10:59:51.891354 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:51.891327 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_94d89ba1-46d8-4c66-bf40-66f7dc002163/init-config-reloader/0.log" Apr 17 10:59:51.923794 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:51.923766 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-ndgrn_5bf035ab-c3f4-4ce5-bd46-9b55d72879b7/cluster-monitoring-operator/0.log"
Apr 17 10:59:52.017432 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:52.017339 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6f79cf884b-wdz2t_30850014-6153-4b95-b74e-6db1f8fa25e4/metrics-server/0.log"
Apr 17 10:59:52.073329 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:52.073300 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-j2j56_ebe2c4a6-9f94-45e4-9d13-c92fb307797f/node-exporter/0.log"
Apr 17 10:59:52.093493 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:52.093462 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-j2j56_ebe2c4a6-9f94-45e4-9d13-c92fb307797f/kube-rbac-proxy/0.log"
Apr 17 10:59:52.116108 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:52.116081 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-j2j56_ebe2c4a6-9f94-45e4-9d13-c92fb307797f/init-textfile/0.log"
Apr 17 10:59:52.572189 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:52.572158 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-bdf54899-tlqjf_8fc6b947-b1ac-4dd3-ad0d-963849a98169/telemeter-client/0.log"
Apr 17 10:59:52.590608 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:52.590581 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-bdf54899-tlqjf_8fc6b947-b1ac-4dd3-ad0d-963849a98169/reload/0.log"
Apr 17 10:59:52.611259 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:52.611228 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-bdf54899-tlqjf_8fc6b947-b1ac-4dd3-ad0d-963849a98169/kube-rbac-proxy/0.log"
Apr 17 10:59:52.640913 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:52.640883 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6d6dc7bc7f-5np9w_4908b461-b08b-4871-9ad8-b70af0813818/thanos-query/0.log"
Apr 17 10:59:52.663236 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:52.663207 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6d6dc7bc7f-5np9w_4908b461-b08b-4871-9ad8-b70af0813818/kube-rbac-proxy-web/0.log"
Apr 17 10:59:52.681178 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:52.681152 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6d6dc7bc7f-5np9w_4908b461-b08b-4871-9ad8-b70af0813818/kube-rbac-proxy/0.log"
Apr 17 10:59:52.700216 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:52.700185 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6d6dc7bc7f-5np9w_4908b461-b08b-4871-9ad8-b70af0813818/prom-label-proxy/0.log"
Apr 17 10:59:52.720750 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:52.720724 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6d6dc7bc7f-5np9w_4908b461-b08b-4871-9ad8-b70af0813818/kube-rbac-proxy-rules/0.log"
Apr 17 10:59:52.738521 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:52.738491 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6d6dc7bc7f-5np9w_4908b461-b08b-4871-9ad8-b70af0813818/kube-rbac-proxy-metrics/0.log"
Apr 17 10:59:53.853333 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:53.853300 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-nn582_ccb66962-e736-41cf-aff2-701b47e39579/networking-console-plugin/0.log"
Apr 17 10:59:54.235481 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:54.235453 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-nskql_386a64c0-8bc9-4f79-b5ac-b26c8b1f4603/console-operator/0.log"
Apr 17 10:59:54.555766 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:54.555736 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67b56bcb54-qb2mf_d227e23b-d414-4bae-adf6-8dae2b965019/console/0.log"
Apr 17 10:59:54.960868 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:54.960793 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-ssrzt_60bcfb22-9d7f-4ed5-894f-3be5795bb21e/volume-data-source-validator/0.log"
Apr 17 10:59:55.434242 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.434202 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"]
Apr 17 10:59:55.434592 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.434578 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27d461b3-a42b-40a1-abf6-97361b6000fb" containerName="dataset-initializer"
Apr 17 10:59:55.434643 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.434593 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="27d461b3-a42b-40a1-abf6-97361b6000fb" containerName="dataset-initializer"
Apr 17 10:59:55.434677 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.434650 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="27d461b3-a42b-40a1-abf6-97361b6000fb" containerName="dataset-initializer"
Apr 17 10:59:55.438020 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.437994 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"
Apr 17 10:59:55.440024 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.439998 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-62hp5\"/\"kube-root-ca.crt\""
Apr 17 10:59:55.440164 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.440068 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-62hp5\"/\"openshift-service-ca.crt\""
Apr 17 10:59:55.440762 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.440735 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-62hp5\"/\"default-dockercfg-tk6ql\""
Apr 17 10:59:55.447697 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.447672 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"]
Apr 17 10:59:55.507270 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.507232 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/19770e95-f633-44b6-9027-48ba4805e9cc-sys\") pod \"perf-node-gather-daemonset-f9xkl\" (UID: \"19770e95-f633-44b6-9027-48ba4805e9cc\") " pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"
Apr 17 10:59:55.507270 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.507272 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/19770e95-f633-44b6-9027-48ba4805e9cc-podres\") pod \"perf-node-gather-daemonset-f9xkl\" (UID: \"19770e95-f633-44b6-9027-48ba4805e9cc\") " pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"
Apr 17 10:59:55.507556 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.507303 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfhv8\" (UniqueName: \"kubernetes.io/projected/19770e95-f633-44b6-9027-48ba4805e9cc-kube-api-access-tfhv8\") pod \"perf-node-gather-daemonset-f9xkl\" (UID: \"19770e95-f633-44b6-9027-48ba4805e9cc\") " pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"
Apr 17 10:59:55.507556 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.507365 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/19770e95-f633-44b6-9027-48ba4805e9cc-proc\") pod \"perf-node-gather-daemonset-f9xkl\" (UID: \"19770e95-f633-44b6-9027-48ba4805e9cc\") " pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"
Apr 17 10:59:55.507556 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.507390 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/19770e95-f633-44b6-9027-48ba4805e9cc-lib-modules\") pod \"perf-node-gather-daemonset-f9xkl\" (UID: \"19770e95-f633-44b6-9027-48ba4805e9cc\") " pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"
Apr 17 10:59:55.542328 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.542303 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-27dzv_156b0fef-9a3a-4de8-81ea-13ac2cb45c19/dns/0.log"
Apr 17 10:59:55.562809 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.562783 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-27dzv_156b0fef-9a3a-4de8-81ea-13ac2cb45c19/kube-rbac-proxy/0.log"
Apr 17 10:59:55.607881 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.607843 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/19770e95-f633-44b6-9027-48ba4805e9cc-sys\") pod \"perf-node-gather-daemonset-f9xkl\" (UID: \"19770e95-f633-44b6-9027-48ba4805e9cc\") " pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"
Apr 17 10:59:55.607881 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.607878 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/19770e95-f633-44b6-9027-48ba4805e9cc-podres\") pod \"perf-node-gather-daemonset-f9xkl\" (UID: \"19770e95-f633-44b6-9027-48ba4805e9cc\") " pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"
Apr 17 10:59:55.608125 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.607928 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfhv8\" (UniqueName: \"kubernetes.io/projected/19770e95-f633-44b6-9027-48ba4805e9cc-kube-api-access-tfhv8\") pod \"perf-node-gather-daemonset-f9xkl\" (UID: \"19770e95-f633-44b6-9027-48ba4805e9cc\") " pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"
Apr 17 10:59:55.608125 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.607994 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/19770e95-f633-44b6-9027-48ba4805e9cc-proc\") pod \"perf-node-gather-daemonset-f9xkl\" (UID: \"19770e95-f633-44b6-9027-48ba4805e9cc\") " pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"
Apr 17 10:59:55.608125 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.608033 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/19770e95-f633-44b6-9027-48ba4805e9cc-lib-modules\") pod \"perf-node-gather-daemonset-f9xkl\" (UID: \"19770e95-f633-44b6-9027-48ba4805e9cc\") " pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"
Apr 17 10:59:55.608125 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.608035 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/19770e95-f633-44b6-9027-48ba4805e9cc-podres\") pod \"perf-node-gather-daemonset-f9xkl\" (UID: \"19770e95-f633-44b6-9027-48ba4805e9cc\") " pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"
Apr 17 10:59:55.608125 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.608039 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/19770e95-f633-44b6-9027-48ba4805e9cc-sys\") pod \"perf-node-gather-daemonset-f9xkl\" (UID: \"19770e95-f633-44b6-9027-48ba4805e9cc\") " pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"
Apr 17 10:59:55.608125 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.608077 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/19770e95-f633-44b6-9027-48ba4805e9cc-proc\") pod \"perf-node-gather-daemonset-f9xkl\" (UID: \"19770e95-f633-44b6-9027-48ba4805e9cc\") " pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"
Apr 17 10:59:55.608314 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.608172 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/19770e95-f633-44b6-9027-48ba4805e9cc-lib-modules\") pod \"perf-node-gather-daemonset-f9xkl\" (UID: \"19770e95-f633-44b6-9027-48ba4805e9cc\") " pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"
Apr 17 10:59:55.615267 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.615238 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfhv8\" (UniqueName: \"kubernetes.io/projected/19770e95-f633-44b6-9027-48ba4805e9cc-kube-api-access-tfhv8\") pod \"perf-node-gather-daemonset-f9xkl\" (UID: \"19770e95-f633-44b6-9027-48ba4805e9cc\") " pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"
Apr 17 10:59:55.667617 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.667588 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6lwvd_730f5024-a732-4fc5-b09b-dab4adee9607/dns-node-resolver/0.log"
Apr 17 10:59:55.749005 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.748919 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"
Apr 17 10:59:55.868656 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:55.868505 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"]
Apr 17 10:59:55.871454 ip-10-0-128-180 kubenswrapper[2579]: W0417 10:59:55.871425 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod19770e95_f633_44b6_9027_48ba4805e9cc.slice/crio-875160a887c55d52f4cf56fdfb8a4cacd1d171bcff15768aead779b042ce3d8a WatchSource:0}: Error finding container 875160a887c55d52f4cf56fdfb8a4cacd1d171bcff15768aead779b042ce3d8a: Status 404 returned error can't find the container with id 875160a887c55d52f4cf56fdfb8a4cacd1d171bcff15768aead779b042ce3d8a
Apr 17 10:59:56.123060 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:56.123029 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qlxxc_e8471c84-04de-44cd-a299-b4dc3d7003ab/node-ca/0.log"
Apr 17 10:59:56.757641 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:56.757612 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-75b9fd94fb-6dvbt_210b0ad6-9a05-4f06-b86a-9dcf876a3ae5/router/0.log"
Apr 17 10:59:56.832096 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:56.832056 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl" event={"ID":"19770e95-f633-44b6-9027-48ba4805e9cc","Type":"ContainerStarted","Data":"21537085c4e3b6d7d11ca949b3dfb6879ea9c100b5d2523f2ec8b2e303e5b123"}
Apr 17 10:59:56.832096 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:56.832101 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl" event={"ID":"19770e95-f633-44b6-9027-48ba4805e9cc","Type":"ContainerStarted","Data":"875160a887c55d52f4cf56fdfb8a4cacd1d171bcff15768aead779b042ce3d8a"}
Apr 17 10:59:56.832303 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:56.832197 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"
Apr 17 10:59:56.847482 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:56.847429 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl" podStartSLOduration=1.8473998360000001 podStartE2EDuration="1.847399836s" podCreationTimestamp="2026-04-17 10:59:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 10:59:56.845223955 +0000 UTC m=+2556.581410310" watchObservedRunningTime="2026-04-17 10:59:56.847399836 +0000 UTC m=+2556.583586189"
Apr 17 10:59:57.056895 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:57.056860 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-gbbkk_50cdff83-6754-4d62-8a0c-bfe31b13a7eb/serve-healthcheck-canary/0.log"
Apr 17 10:59:57.376808 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:57.376723 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-5c64q_68dbc54c-11d7-4146-ac64-235b136dbc33/insights-operator/0.log"
Apr 17 10:59:57.393235 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:57.393206 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4kssn_b7940a9f-1624-49aa-8779-367a5cc2092e/kube-rbac-proxy/0.log"
Apr 17 10:59:57.410634 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:57.410605 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4kssn_b7940a9f-1624-49aa-8779-367a5cc2092e/exporter/0.log"
Apr 17 10:59:57.429815 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:57.429785 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4kssn_b7940a9f-1624-49aa-8779-367a5cc2092e/extractor/0.log"
Apr 17 10:59:59.074088 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:59.074056 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-controller-manager-55f595dffd-wst4h_12e6d84a-d886-4010-833b-673f093c07cf/manager/0.log"
Apr 17 10:59:59.095825 ip-10-0-128-180 kubenswrapper[2579]: I0417 10:59:59.095797 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-operator-747c5859c7-gcn2w_c507b879-9e2a-412e-9f22-5413cf89d98f/jobset-operator/0.log"
Apr 17 11:00:02.130244 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:02.130204 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-pwt5z_0970ce61-ac90-4b77-8b8b-3e0f732cce33/kube-storage-version-migrator-operator/0.log"
Apr 17 11:00:02.845537 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:02.845506 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-62hp5/perf-node-gather-daemonset-f9xkl"
Apr 17 11:00:03.159863 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:03.159779 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jwf2j_c1879161-9d08-4f31-af72-d7e9b7b6b0c7/kube-multus-additional-cni-plugins/0.log"
Apr 17 11:00:03.182501 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:03.182473 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jwf2j_c1879161-9d08-4f31-af72-d7e9b7b6b0c7/egress-router-binary-copy/0.log"
Apr 17 11:00:03.205482 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:03.205453 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jwf2j_c1879161-9d08-4f31-af72-d7e9b7b6b0c7/cni-plugins/0.log"
Apr 17 11:00:03.225445 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:03.225400 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jwf2j_c1879161-9d08-4f31-af72-d7e9b7b6b0c7/bond-cni-plugin/0.log"
Apr 17 11:00:03.244457 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:03.244402 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jwf2j_c1879161-9d08-4f31-af72-d7e9b7b6b0c7/routeoverride-cni/0.log"
Apr 17 11:00:03.263262 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:03.263231 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jwf2j_c1879161-9d08-4f31-af72-d7e9b7b6b0c7/whereabouts-cni-bincopy/0.log"
Apr 17 11:00:03.281040 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:03.281010 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jwf2j_c1879161-9d08-4f31-af72-d7e9b7b6b0c7/whereabouts-cni/0.log"
Apr 17 11:00:03.332627 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:03.332596 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-v8cbj_c8515f25-7800-40c5-a053-06805e22f902/kube-multus/0.log"
Apr 17 11:00:03.349629 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:03.349598 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6q4fn_964ce349-102e-4852-b1d8-a4c58565a4a0/network-metrics-daemon/0.log"
Apr 17 11:00:03.366593 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:03.366560 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6q4fn_964ce349-102e-4852-b1d8-a4c58565a4a0/kube-rbac-proxy/0.log"
Apr 17 11:00:04.214589 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:04.214559 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-controller/0.log"
Apr 17 11:00:04.236150 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:04.236116 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/0.log"
Apr 17 11:00:04.247657 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:04.247622 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovn-acl-logging/1.log"
Apr 17 11:00:04.265839 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:04.265756 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/kube-rbac-proxy-node/0.log"
Apr 17 11:00:04.287611 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:04.287576 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 11:00:04.307066 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:04.307035 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/northd/0.log"
Apr 17 11:00:04.324655 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:04.324608 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/nbdb/0.log"
Apr 17 11:00:04.345012 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:04.344980 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/sbdb/0.log"
Apr 17 11:00:04.431429 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:04.431391 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ggsw6_9e6ec344-3806-4357-8d4e-229cc8d0f613/ovnkube-controller/0.log"
Apr 17 11:00:05.849533 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:05.849504 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-vf6k6_7715462b-006b-40d6-ba3b-5b84dfb6caeb/check-endpoints/0.log"
Apr 17 11:00:05.916478 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:05.916447 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-xzjpm_775480d3-f4fc-4837-9404-54348aa3f205/network-check-target-container/0.log"
Apr 17 11:00:06.753167 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:06.753066 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-pg58d_f8ebcb33-32f1-469e-a5f3-c80aa250dc88/iptables-alerter/0.log"
Apr 17 11:00:07.306383 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:07.306349 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-hdfrm_49bcd63d-9323-4368-9c4c-5edfc7b371f7/tuned/0.log"
Apr 17 11:00:08.834509 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:08.834468 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-h4frw_13e5ced9-25b8-4b3f-9a0d-ac11761a6140/cluster-samples-operator/0.log"
Apr 17 11:00:08.849500 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:08.849471 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-h4frw_13e5ced9-25b8-4b3f-9a0d-ac11761a6140/cluster-samples-operator-watch/0.log"
Apr 17 11:00:09.737407 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:09.737372 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-fhlm5_0837740f-2592-494a-bec7-ad834d8b557e/service-ca-operator/1.log"
Apr 17 11:00:09.738332 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:09.738310 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-fhlm5_0837740f-2592-494a-bec7-ad834d8b557e/service-ca-operator/0.log"
Apr 17 11:00:10.000193 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:10.000108 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-5h75w_4acfea86-c51d-4bc7-a81a-fe43884aca68/service-ca-controller/0.log"
Apr 17 11:00:10.422543 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:10.422506 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-9v4bq_7ec60248-7d9c-4930-984a-1104f6a4e7a0/csi-driver/0.log"
Apr 17 11:00:10.440679 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:10.440653 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-9v4bq_7ec60248-7d9c-4930-984a-1104f6a4e7a0/csi-node-driver-registrar/0.log"
Apr 17 11:00:10.458156 ip-10-0-128-180 kubenswrapper[2579]: I0417 11:00:10.458129 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-9v4bq_7ec60248-7d9c-4930-984a-1104f6a4e7a0/csi-liveness-probe/0.log"