Apr 22 18:44:10.828973 ip-10-0-134-109 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 18:44:10.828984 ip-10-0-134-109 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 18:44:10.828994 ip-10-0-134-109 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 18:44:10.829261 ip-10-0-134-109 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 18:44:20.860320 ip-10-0-134-109 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 18:44:20.860344 ip-10-0-134-109 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot de1851bd575447429432e0a508de4b4d --
Apr 22 18:46:42.825620 ip-10-0-134-109 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:46:43.255474 ip-10-0-134-109 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:43.255474 ip-10-0-134-109 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:46:43.255474 ip-10-0-134-109 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:43.255474 ip-10-0-134-109 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:46:43.255474 ip-10-0-134-109 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:43.257229 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.257114 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 22 18:46:43.264414 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264388 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:43.264414 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264409 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:43.264414 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264413 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:43.264414 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264416 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:43.264414 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264420 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:43.264414 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264424 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:43.264654 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264428 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:43.264654 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264432 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:43.264654 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264435 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:43.264654 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264438 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:43.264654 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264441 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:43.264654 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264444 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:43.264654 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264446 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:43.264654 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264449 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:43.264654 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264451 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:43.264654 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264454 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:43.264654 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264457 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:43.264654 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264460 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:43.264654 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264462 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:43.264654 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264465 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:43.264654 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264467 2575 feature_gate.go:328] unrecognized feature 
gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:43.264654 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264470 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:43.264654 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264472 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:43.264654 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264475 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:43.264654 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264481 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:43.265112 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264484 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:43.265112 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264486 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:43.265112 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264489 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:43.265112 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264492 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:43.265112 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264495 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:43.265112 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264497 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:43.265112 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264500 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:43.265112 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264502 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:43.265112 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264505 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:43.265112 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264507 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:43.265112 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264510 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:43.265112 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264513 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:43.265112 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264516 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:43.265112 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264519 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:43.265112 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264523 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:43.265112 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264526 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:43.265112 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264528 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:43.265112 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264532 2575 feature_gate.go:328] unrecognized 
feature gate: MultiDiskSetup Apr 22 18:46:43.265112 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264535 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:43.265112 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264537 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:43.265628 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264540 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:43.265628 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264542 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:43.265628 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264545 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:43.265628 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264548 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:43.265628 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264550 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:43.265628 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264555 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:46:43.265628 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264560 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:43.265628 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264563 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:43.265628 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264566 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:43.265628 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264569 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:43.265628 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264571 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:43.265628 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264574 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:43.265628 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264577 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:43.265628 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264580 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:43.265628 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264583 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:43.265628 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264586 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:43.265628 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264588 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:43.265628 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264591 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:43.265628 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264593 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:43.265628 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264597 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:43.266158 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264600 2575 
feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:43.266158 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264602 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:43.266158 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264605 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:43.266158 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264607 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:43.266158 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264610 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:43.266158 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264613 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:43.266158 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264616 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:43.266158 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264619 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:43.266158 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264622 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:43.266158 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264624 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:43.266158 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264628 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:46:43.266158 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264632 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:43.266158 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264635 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:43.266158 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264639 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:43.266158 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264642 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:43.266158 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264645 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:43.266158 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264648 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:43.266158 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264651 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:43.266158 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264654 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:43.266643 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264657 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:43.266643 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.264660 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:43.266643 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265070 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:43.266643 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265075 2575 feature_gate.go:328] unrecognized feature 
gate: AdditionalRoutingCapabilities Apr 22 18:46:43.266643 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265078 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:43.266643 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265081 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:43.266643 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265084 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:43.266643 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265087 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:43.266643 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265090 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:43.266643 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265093 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:43.266643 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265096 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:43.266643 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265099 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:43.266643 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265102 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:43.266643 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265105 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:43.266643 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265108 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:43.266643 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265110 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:43.266643 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265114 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:46:43.266643 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265118 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:43.266643 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265121 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:43.266643 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265125 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:43.267129 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265127 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:43.267129 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265130 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:43.267129 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265133 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:43.267129 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265135 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:43.267129 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265138 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:43.267129 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265140 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:43.267129 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265144 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:43.267129 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265147 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:43.267129 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265149 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:43.267129 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265152 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:43.267129 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265155 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:43.267129 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265158 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:43.267129 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265160 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:43.267129 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265163 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:43.267129 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265165 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:43.267129 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265168 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:43.267129 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265170 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:43.267129 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265173 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:43.267129 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265175 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:43.267129 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265178 2575 feature_gate.go:328] 
unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:43.267640 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265180 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:43.267640 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265182 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:43.267640 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265185 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:43.267640 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265187 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:43.267640 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265190 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:43.267640 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265192 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:43.267640 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265195 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:43.267640 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265198 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:43.267640 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265200 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:43.267640 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265218 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:43.267640 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265221 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:43.267640 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265225 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:43.267640 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265227 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:43.267640 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265230 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:43.267640 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265232 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:43.267640 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265235 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:43.267640 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265238 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:43.267640 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265241 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:43.267640 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265244 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:43.267640 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265247 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:43.268141 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265250 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:43.268141 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265252 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:43.268141 ip-10-0-134-109 
kubenswrapper[2575]: W0422 18:46:43.265255 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:43.268141 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265258 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:43.268141 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265260 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:43.268141 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265263 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:43.268141 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265266 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:43.268141 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265268 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:43.268141 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265271 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:43.268141 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265275 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:46:43.268141 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265278 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:43.268141 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265281 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:43.268141 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265283 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:43.268141 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265287 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:43.268141 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265289 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:43.268141 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265292 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:43.268141 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265294 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:43.268141 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265297 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:43.268141 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265300 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:43.268623 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265303 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:43.268623 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265305 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:43.268623 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265308 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:43.268623 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265311 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:43.268623 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265314 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 
22 18:46:43.268623 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265317 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:43.268623 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265319 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:43.268623 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265322 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:43.268623 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.265324 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:43.268623 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265404 2575 flags.go:64] FLAG: --address="0.0.0.0" Apr 22 18:46:43.268623 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265411 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 22 18:46:43.268623 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265419 2575 flags.go:64] FLAG: --anonymous-auth="true" Apr 22 18:46:43.268623 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265423 2575 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 22 18:46:43.268623 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265430 2575 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 22 18:46:43.268623 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265433 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 22 18:46:43.268623 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265438 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 22 18:46:43.268623 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265447 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 22 18:46:43.268623 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265451 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 22 18:46:43.268623 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265454 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 22 18:46:43.268623 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265457 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 22 18:46:43.268623 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265461 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265464 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265467 2575 flags.go:64] FLAG: --cgroup-root="" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265469 2575 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265472 2575 flags.go:64] FLAG: --client-ca-file="" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265475 2575 flags.go:64] FLAG: --cloud-config="" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265478 2575 flags.go:64] FLAG: --cloud-provider="external" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265481 2575 flags.go:64] FLAG: --cluster-dns="[]" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265486 2575 flags.go:64] FLAG: --cluster-domain="" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265489 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" 
Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265492 2575 flags.go:64] FLAG: --config-dir="" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265496 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265500 2575 flags.go:64] FLAG: --container-log-max-files="5" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265504 2575 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265507 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265510 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265514 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265517 2575 flags.go:64] FLAG: --contention-profiling="false" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265520 2575 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265523 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265526 2575 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265529 2575 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265533 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265536 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265538 2575 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 22 18:46:43.269125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265541 2575 flags.go:64] FLAG: --enable-load-reader="false" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265546 2575 flags.go:64] FLAG: --enable-server="true" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265549 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265553 2575 flags.go:64] FLAG: --event-burst="100" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265556 2575 flags.go:64] FLAG: --event-qps="50" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265559 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265562 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265565 2575 flags.go:64] FLAG: --eviction-hard="" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265569 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265572 2575 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265575 2575 
flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265578 2575 flags.go:64] FLAG: --eviction-soft="" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265581 2575 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265583 2575 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265586 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265589 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265592 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265595 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265598 2575 flags.go:64] FLAG: --feature-gates="" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265602 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265605 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265608 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265611 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265615 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265618 2575 flags.go:64] FLAG: --help="false" Apr 22 18:46:43.269759 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265621 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-134-109.ec2.internal" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265624 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265627 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265630 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265634 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265638 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265641 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265644 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265647 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265650 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265653 
2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265656 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265659 2575 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265662 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265665 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265668 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265671 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265674 2575 flags.go:64] FLAG: --lock-file="" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265677 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265680 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265683 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265689 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265692 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265694 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:46:43.270493 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265697 2575 flags.go:64] FLAG: --logging-format="text" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265700 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265704 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265706 2575 flags.go:64] FLAG: --manifest-url="" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265709 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265713 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265719 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265723 2575 flags.go:64] FLAG: --max-pods="110" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265726 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265729 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265732 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265735 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: 
I0422 18:46:43.265738 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265741 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265744 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265751 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265754 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265757 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265761 2575 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265763 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265770 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265773 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265777 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265780 2575 flags.go:64] FLAG: --port="10250" Apr 22 18:46:43.271123 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265783 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265786 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b96b9214ad0b1e62" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265789 2575 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265792 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265795 2575 flags.go:64] FLAG: --register-node="true" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265798 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265801 2575 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265805 2575 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265807 2575 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265810 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265813 2575 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265817 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265820 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265823 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 
18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265827 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265830 2575 flags.go:64] FLAG: --runonce="false" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265833 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265836 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265839 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265842 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265845 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265848 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265851 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265855 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265857 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265860 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:46:43.271751 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265863 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265866 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265869 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265872 2575 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265875 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265882 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265885 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265887 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265892 2575 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265894 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265897 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265900 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265903 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265906 2575 flags.go:64] FLAG: 
--v="2" Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265910 2575 flags.go:64] FLAG: --version="false" Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265914 2575 flags.go:64] FLAG: --vmodule="" Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265919 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.265922 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266013 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266016 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266021 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266024 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266026 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:43.272405 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266029 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:43.272964 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266032 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:43.272964 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266035 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:43.272964 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266037 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:43.272964 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266041 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:46:43.272964 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266044 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:43.272964 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266047 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:43.272964 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266049 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:43.272964 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266052 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:43.272964 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266054 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:43.272964 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266057 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:43.272964 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266060 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:43.272964 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266062 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:43.272964 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266065 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:43.272964 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266069 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:43.272964 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266072 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:43.272964 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266075 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:43.272964 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266077 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:43.272964 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266080 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:43.272964 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266082 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:43.272964 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266085 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:43.273483 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266088 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:43.273483 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266090 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:43.273483 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266093 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:43.273483 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266095 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:43.273483 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266098 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:43.273483 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266101 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:43.273483 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266103 2575 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesAzure Apr 22 18:46:43.273483 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266106 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:43.273483 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266110 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:43.273483 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266112 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:43.273483 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266115 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:43.273483 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266118 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:43.273483 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266120 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:43.273483 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266123 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:43.273483 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266125 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:43.273483 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266128 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:43.273483 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266130 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:43.273483 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266133 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:43.273483 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266136 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:43.273483 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266138 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:43.273972 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266141 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:43.273972 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266143 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:43.273972 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266145 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:43.273972 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266149 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:43.273972 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266152 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:43.273972 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266155 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:43.273972 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266158 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:43.273972 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266160 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:43.273972 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266163 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:43.273972 ip-10-0-134-109 
kubenswrapper[2575]: W0422 18:46:43.266166 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:43.273972 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266168 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:43.273972 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266171 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:43.273972 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266174 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:43.273972 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266176 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:43.273972 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266179 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:43.273972 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266181 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:43.273972 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266184 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:43.273972 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266186 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:43.273972 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266189 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:43.273972 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266191 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:43.274551 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266195 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:43.274551 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266198 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:43.274551 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266200 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:43.274551 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266218 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:43.274551 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266221 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:43.274551 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266224 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:43.274551 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266226 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:43.274551 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266229 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:43.274551 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266232 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:43.274551 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266234 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:43.274551 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266237 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:43.274551 ip-10-0-134-109 
kubenswrapper[2575]: W0422 18:46:43.266240 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:43.274551 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266242 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:43.274551 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266245 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:43.274551 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266247 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:43.274551 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266250 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:43.274551 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266253 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:43.274551 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266257 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:43.274551 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266262 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:46:43.275029 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.266265 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:43.275029 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.267163 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:46:43.275274 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.275197 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 18:46:43.275274 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.275231 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 18:46:43.275337 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275282 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:43.275337 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275288 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:43.275337 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275292 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:43.275337 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275295 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:43.275337 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275298 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:43.275337 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275302 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:46:43.275337 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275306 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:43.275337 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275309 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:43.275337 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275313 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:43.275337 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275316 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:43.275337 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275319 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:43.275337 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275321 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:43.275337 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275324 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:43.275337 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275327 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:43.275337 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275329 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:43.275337 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275332 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:43.275337 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275335 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:43.275337 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275338 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:43.275337 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275341 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:43.275806 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275345 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:43.275806 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275348 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:43.275806 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275351 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:43.275806 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275353 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:43.275806 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275357 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:43.275806 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275359 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:43.275806 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275363 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:46:43.275806 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275368 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:43.275806 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275371 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:43.275806 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275374 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:43.275806 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275376 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:43.275806 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275379 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:43.275806 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275382 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:43.275806 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275386 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:43.275806 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275388 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:43.275806 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275391 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:43.275806 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275393 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:43.275806 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275396 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:43.275806 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275398 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:43.276275 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275401 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:43.276275 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275404 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:43.276275 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275406 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:43.276275 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275408 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:43.276275 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275411 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:43.276275 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275414 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:43.276275 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275416 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:43.276275 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275419 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:43.276275 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275421 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:43.276275 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275424 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:43.276275 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275426 2575 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:43.276275 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275429 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:43.276275 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275431 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:43.276275 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275434 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:43.276275 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275438 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:43.276275 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275441 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:43.276275 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275443 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:43.276275 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275447 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:43.276275 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275449 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:43.276275 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275452 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:43.276765 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275454 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:43.276765 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275457 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:43.276765 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275460 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:43.276765 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275462 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:43.276765 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275465 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:43.276765 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275467 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:43.276765 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275471 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:43.276765 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275473 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:43.276765 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275476 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:43.276765 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275479 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:43.276765 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275481 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:43.276765 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275484 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:43.276765 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275487 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:43.276765 
ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275489 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:43.276765 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275492 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:43.276765 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275494 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:43.276765 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275497 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:43.276765 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275499 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:43.276765 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275502 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:43.276765 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275504 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:43.277252 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275507 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:43.277252 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275509 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:43.277252 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275512 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:43.277252 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275515 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:43.277252 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275517 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:43.277252 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275520 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:43.277252 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275523 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:43.277252 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275525 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:43.277252 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.275530 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:46:43.277252 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275633 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:43.277252 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275638 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:43.277252 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275641 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:43.277252 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275644 2575 feature_gate.go:328] unrecognized feature gate: 
BuildCSIVolumes Apr 22 18:46:43.277252 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275647 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:43.277252 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275650 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:43.277252 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275652 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:43.277648 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275655 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:43.277648 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275658 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:43.277648 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275661 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:43.277648 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275665 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:43.277648 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275667 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:43.277648 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275670 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:43.277648 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275673 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:43.277648 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275676 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:43.277648 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275679 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:46:43.277648 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275681 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:43.277648 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275684 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:43.277648 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275686 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:43.277648 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275689 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:43.277648 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275691 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:43.277648 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275694 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:43.277648 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275696 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:43.277648 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275699 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:43.277648 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275702 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:43.277648 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275704 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:46:43.277648 ip-10-0-134-109 kubenswrapper[2575]: W0422 
18:46:43.275707 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:43.278132 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275710 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:43.278132 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275712 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:43.278132 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275715 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:43.278132 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275717 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:43.278132 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275720 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:43.278132 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275723 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:43.278132 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275725 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:43.278132 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275728 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:43.278132 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275730 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:43.278132 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275733 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:43.278132 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275735 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:43.278132 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275738 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:43.278132 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275740 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:43.278132 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275743 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:43.278132 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275745 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:43.278132 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275748 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:43.278132 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275751 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:43.278132 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275754 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:43.278132 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275756 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:43.278132 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275759 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:43.278655 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275761 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:43.278655 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275764 2575 feature_gate.go:328] unrecognized 
feature gate: VSphereHostVMGroupZonal Apr 22 18:46:43.278655 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275766 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:46:43.278655 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275769 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:43.278655 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275771 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:43.278655 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275774 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:43.278655 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275776 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:43.278655 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275779 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:43.278655 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275781 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:43.278655 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275784 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:43.278655 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275787 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:43.278655 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275789 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:43.278655 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275793 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:46:43.278655 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275797 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:43.278655 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275800 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:43.278655 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275803 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:43.278655 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275806 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:43.278655 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275809 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:43.278655 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275812 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:46:43.279121 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275815 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:46:43.279121 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275818 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:43.279121 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275821 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:43.279121 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275824 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:43.279121 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275826 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:43.279121 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275829 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:43.279121 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275831 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:43.279121 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275834 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:43.279121 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275836 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:43.279121 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275838 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:43.279121 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275841 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:43.279121 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275844 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:43.279121 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275846 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:43.279121 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275849 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:43.279121 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275852 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:43.279121 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275854 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:43.279121 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275857 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:43.279121 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275859 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:43.279121 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275862 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:43.279121 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:43.275865 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:43.279730 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.275870 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false 
ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:46:43.279730 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.276672 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 18:46:43.280849 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.280835 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 18:46:43.281728 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.281716 2575 server.go:1019] "Starting client certificate rotation" Apr 22 18:46:43.281829 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.281812 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:46:43.281863 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.281855 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:46:43.309011 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.308987 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:46:43.310787 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.310767 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:46:43.321930 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.321909 2575 log.go:25] "Validated CRI v1 runtime API" Apr 22 18:46:43.327550 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.327522 2575 log.go:25] "Validated CRI v1 image API" Apr 22 18:46:43.329421 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.329404 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 18:46:43.333761 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.333738 2575 fs.go:135] Filesystem UUIDs: map[28edcb30-475e-4500-87f0-a6bb3cdb1143:/dev/nvme0n1p4 3859142c-62cf-4cf5-88cf-1e19768a28da:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 22 18:46:43.333821 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.333761 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 18:46:43.337180 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.337161 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:46:43.339871 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.339761 2575 manager.go:217] Machine: {Timestamp:2026-04-22 18:46:43.337821228 +0000 UTC m=+0.400305549 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3106561 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] 
MachineID:ec2cd5bed4501008147e7cac86dcfc05 SystemUUID:ec2cd5be-d450-1008-147e-7cac86dcfc05 BootID:de1851bd-5754-4742-9432-e0a508de4b4d Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:cc:86:dd:3a:41 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:cc:86:dd:3a:41 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6e:56:08:da:e8:6b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 18:46:43.339871 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.339867 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 18:46:43.339988 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.339962 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 18:46:43.340992 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.340970 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 18:46:43.341132 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.340996 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-109.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:46:43.341179 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.341143 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:46:43.341179 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.341153 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:46:43.341179 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.341166 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:46:43.341921 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.341910 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:46:43.343698 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.343685 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:46:43.344005 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.343996 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:46:43.346291 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.346279 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:46:43.346335 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.346298 2575 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Apr 22 18:46:43.346335 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.346315 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 18:46:43.346335 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.346326 2575 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:46:43.346433 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.346337 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 18:46:43.347456 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.347441 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:46:43.347514 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.347467 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:46:43.350182 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.350164 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:46:43.352001 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.351987 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:46:43.353342 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.353320 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:46:43.353342 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.353338 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:46:43.353342 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.353344 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:46:43.353480 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.353350 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:46:43.353480 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.353356 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:46:43.353480 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.353362 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:46:43.353480 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.353368 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:46:43.353480 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.353374 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:46:43.353480 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.353380 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:46:43.353480 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.353386 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:46:43.353480 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.353406 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:46:43.353480 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.353415 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:46:43.354280 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.354271 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:46:43.354280 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.354281 2575 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 22 18:46:43.357327 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:43.357301 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:46:43.357327 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.357320 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-109.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:46:43.357327 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:43.357302 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-109.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:46:43.358141 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.358130 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:46:43.358181 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.358169 2575 server.go:1295] "Started kubelet" Apr 22 18:46:43.358295 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.358266 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:46:43.358435 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.358373 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:46:43.358477 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.358464 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:46:43.359025 ip-10-0-134-109 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 18:46:43.360086 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.360070 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:46:43.365118 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.365090 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:46:43.365873 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.365846 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-x4tx7" Apr 22 18:46:43.367620 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.367596 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:46:43.368100 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.368075 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:46:43.368522 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:43.367599 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-109.ec2.internal.18a8c2391160d1f0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-109.ec2.internal,UID:ip-10-0-134-109.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-109.ec2.internal,},FirstTimestamp:2026-04-22 18:46:43.35814296 +0000 UTC m=+0.420627282,LastTimestamp:2026-04-22 18:46:43.35814296 +0000 UTC m=+0.420627282,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-109.ec2.internal,}" Apr 22 18:46:43.368781 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:43.368762 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-109.ec2.internal\" not found" Apr 22 18:46:43.368844 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.368821 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:46:43.368844 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.368840 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:46:43.368935 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.368914 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:46:43.368980 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.368971 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:46:43.368980 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.368978 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:46:43.369363 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.369347 2575 factory.go:153] Registering CRI-O factory Apr 22 18:46:43.369463 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.369368 2575 factory.go:223] Registration of the crio container factory successfully Apr 22 18:46:43.369463 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:43.369410 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:46:43.369463 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.369424 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:46:43.369463 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.369435 2575 factory.go:55] Registering systemd factory Apr 22 18:46:43.369463 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.369444 2575 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:46:43.369463 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.369465 2575 factory.go:103] Registering Raw factory Apr 22 18:46:43.369712 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.369481 2575 manager.go:1196] Started watching for new ooms in manager Apr 22 18:46:43.369837 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.369826 2575 manager.go:319] Starting recovery of all containers Apr 22 18:46:43.370924 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.370896 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-x4tx7" Apr 22 18:46:43.375987 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.375818 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:43.378330 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:43.378303 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-134-109.ec2.internal\" not found" node="ip-10-0-134-109.ec2.internal" Apr 22 18:46:43.380144 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.380123 2575 manager.go:324] Recovery completed Apr 22 18:46:43.384580 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.384566 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:43.387286 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.387271 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-109.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:43.387356 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.387300 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-109.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:43.387356 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.387320 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-109.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:43.387836 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.387822 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:46:43.387910 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.387835 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:46:43.387910 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.387865 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:46:43.390168 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.390154 2575 policy_none.go:49] "None policy: Start" Apr 22 18:46:43.390251 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.390174 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:46:43.390251 ip-10-0-134-109 kubenswrapper[2575]: I0422 
18:46:43.390187 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:46:43.431454 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.425831 2575 manager.go:341] "Starting Device Plugin manager" Apr 22 18:46:43.431454 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:43.425864 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:46:43.431454 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.425874 2575 server.go:85] "Starting device plugin registration server" Apr 22 18:46:43.431454 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.426163 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:46:43.431454 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.426174 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:46:43.431454 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.426383 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:46:43.431454 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.426461 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:46:43.431454 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.426468 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:46:43.431454 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:43.426907 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 18:46:43.431454 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:43.426938 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-109.ec2.internal\" not found" Apr 22 18:46:43.463124 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.463075 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:46:43.464468 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.464439 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 18:46:43.464583 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.464490 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:46:43.464583 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.464522 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 18:46:43.464583 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.464535 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:46:43.464725 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:43.464599 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:46:43.467901 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.467881 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:43.526513 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.526440 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:43.527587 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.527567 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-109.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:43.527682 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.527601 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-109.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:43.527682 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.527611 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-109.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:43.527682 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.527635 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-109.ec2.internal" Apr 22 18:46:43.535539 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.535524 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-109.ec2.internal" Apr 22 18:46:43.535583 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:43.535548 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-109.ec2.internal\": node \"ip-10-0-134-109.ec2.internal\" not found" Apr 22 18:46:43.547295 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:43.547270 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-109.ec2.internal\" not found" Apr 22 18:46:43.565668 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.565639 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-109.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-134-109.ec2.internal"] Apr 22 18:46:43.565762 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.565730 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:43.567643 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.567625 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-109.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:43.567729 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.567655 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-109.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:43.567729 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.567664 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-109.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:43.568837 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.568825 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:43.568990 ip-10-0-134-109 
kubenswrapper[2575]: I0422 18:46:43.568974 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-109.ec2.internal" Apr 22 18:46:43.569047 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.569007 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:43.569538 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.569519 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-109.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:43.569538 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.569520 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-109.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:43.569653 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.569551 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-109.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:43.569653 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.569552 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-109.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:43.569653 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.569563 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-109.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:43.569653 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.569569 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-109.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:43.570699 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.570687 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-109.ec2.internal" Apr 22 18:46:43.570749 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.570710 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:43.571363 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.571346 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-109.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:43.571470 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.571387 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-109.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:43.571470 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.571400 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-109.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:43.589852 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:43.589821 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-109.ec2.internal\" not found" node="ip-10-0-134-109.ec2.internal" Apr 22 18:46:43.593187 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:43.593170 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-109.ec2.internal\" not found" node="ip-10-0-134-109.ec2.internal" Apr 22 18:46:43.648318 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:43.648287 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-109.ec2.internal\" not found" Apr 22 18:46:43.670095 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.670067 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7cab3aff7832fd5bf55174d8022bd378-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-109.ec2.internal\" (UID: \"7cab3aff7832fd5bf55174d8022bd378\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-109.ec2.internal" Apr 22 18:46:43.670150 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.670100 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cab3aff7832fd5bf55174d8022bd378-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-109.ec2.internal\" (UID: \"7cab3aff7832fd5bf55174d8022bd378\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-109.ec2.internal" Apr 22 18:46:43.670150 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.670120 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b12aca16166beec2da98d7da868da1f9-config\") pod \"kube-apiserver-proxy-ip-10-0-134-109.ec2.internal\" (UID: \"b12aca16166beec2da98d7da868da1f9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-109.ec2.internal" Apr 22 18:46:43.748700 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:43.748652 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-109.ec2.internal\" not found" Apr 22 18:46:43.771050 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.771015 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/7cab3aff7832fd5bf55174d8022bd378-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-109.ec2.internal\" (UID: \"7cab3aff7832fd5bf55174d8022bd378\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-109.ec2.internal" Apr 22 18:46:43.771167 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.771058 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cab3aff7832fd5bf55174d8022bd378-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-109.ec2.internal\" (UID: \"7cab3aff7832fd5bf55174d8022bd378\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-109.ec2.internal" Apr 22 18:46:43.771167 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.771103 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7cab3aff7832fd5bf55174d8022bd378-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-109.ec2.internal\" (UID: \"7cab3aff7832fd5bf55174d8022bd378\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-109.ec2.internal" Apr 22 18:46:43.771167 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.771118 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7cab3aff7832fd5bf55174d8022bd378-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-109.ec2.internal\" (UID: \"7cab3aff7832fd5bf55174d8022bd378\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-109.ec2.internal" Apr 22 18:46:43.771305 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.771171 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b12aca16166beec2da98d7da868da1f9-config\") pod \"kube-apiserver-proxy-ip-10-0-134-109.ec2.internal\" (UID: \"b12aca16166beec2da98d7da868da1f9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-109.ec2.internal" Apr 22 18:46:43.771305 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.771226 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b12aca16166beec2da98d7da868da1f9-config\") pod \"kube-apiserver-proxy-ip-10-0-134-109.ec2.internal\" (UID: \"b12aca16166beec2da98d7da868da1f9\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-109.ec2.internal" Apr 22 18:46:43.849367 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:43.849298 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-109.ec2.internal\" not found" Apr 22 18:46:43.891885 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.891862 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-109.ec2.internal" Apr 22 18:46:43.895307 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:43.895285 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-109.ec2.internal" Apr 22 18:46:43.950034 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:43.949994 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-109.ec2.internal\" not found" Apr 22 18:46:44.050506 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:44.050477 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-109.ec2.internal\" not found" Apr 22 18:46:44.151119 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:44.151033 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-109.ec2.internal\" not found" Apr 22 18:46:44.251582 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:44.251548 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-109.ec2.internal\" not found" Apr 22 18:46:44.282116 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:44.282081 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 18:46:44.282626 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:44.282295 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:46:44.282626 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:44.282308 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:46:44.352701 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:44.352665 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-109.ec2.internal\" not found" Apr 22 18:46:44.367840 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:44.367815 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 18:46:44.372769 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:44.372732 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:41:43 +0000 UTC" deadline="2027-10-22 09:16:29.926537071 +0000 UTC" Apr 22 18:46:44.372769 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:44.372766 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13142h29m45.553775545s" Apr 22 18:46:44.377415 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:44.377395 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:46:44.395664 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:44.395635 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-tf97c" Apr 22 18:46:44.402426 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:44.402364 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-tf97c" Apr 22 18:46:44.452979 ip-10-0-134-109 kubenswrapper[2575]: E0422 
18:46:44.452951 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-109.ec2.internal\" not found" Apr 22 18:46:44.456984 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:44.456945 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cab3aff7832fd5bf55174d8022bd378.slice/crio-7a77b7c7722433098e083b236615ddf146c1f5664b4255180cc4bd3fc5d7fd1b WatchSource:0}: Error finding container 7a77b7c7722433098e083b236615ddf146c1f5664b4255180cc4bd3fc5d7fd1b: Status 404 returned error can't find the container with id 7a77b7c7722433098e083b236615ddf146c1f5664b4255180cc4bd3fc5d7fd1b Apr 22 18:46:44.457184 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:44.457166 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb12aca16166beec2da98d7da868da1f9.slice/crio-eeedaaa9a9af14b588d1103ecae0a422c87b42da7f971f1db75a3053af0669cc WatchSource:0}: Error finding container eeedaaa9a9af14b588d1103ecae0a422c87b42da7f971f1db75a3053af0669cc: Status 404 returned error can't find the container with id eeedaaa9a9af14b588d1103ecae0a422c87b42da7f971f1db75a3053af0669cc Apr 22 18:46:44.463426 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:44.463410 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:46:44.467765 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:44.467724 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-109.ec2.internal" event={"ID":"7cab3aff7832fd5bf55174d8022bd378","Type":"ContainerStarted","Data":"7a77b7c7722433098e083b236615ddf146c1f5664b4255180cc4bd3fc5d7fd1b"} Apr 22 18:46:44.468492 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:44.468466 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-109.ec2.internal" event={"ID":"b12aca16166beec2da98d7da868da1f9","Type":"ContainerStarted","Data":"eeedaaa9a9af14b588d1103ecae0a422c87b42da7f971f1db75a3053af0669cc"} Apr 22 18:46:44.554042 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:44.554007 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-109.ec2.internal\" not found" Apr 22 18:46:44.616047 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:44.616023 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:44.669223 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:44.669124 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-109.ec2.internal" Apr 22 18:46:44.679643 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:44.679609 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:46:44.681185 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:44.681164 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-109.ec2.internal" Apr 22 18:46:44.688837 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:44.688820 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 
18:46:44.721240 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:44.721196 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:45.348137 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.348105 2575 apiserver.go:52] "Watching apiserver" Apr 22 18:46:45.358497 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.358465 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 18:46:45.359507 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.359472 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-134-109.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449","openshift-cluster-node-tuning-operator/tuned-4w69n","openshift-dns/node-resolver-kfn2f","openshift-image-registry/node-ca-rqzlh","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-109.ec2.internal","openshift-multus/multus-additional-cni-plugins-vdhgw","openshift-multus/multus-rlzcp","openshift-multus/network-metrics-daemon-zn7pv","openshift-network-diagnostics/network-check-target-84n9f","openshift-network-operator/iptables-alerter-rqxnk","openshift-ovn-kubernetes/ovnkube-node-p7bgj","kube-system/konnectivity-agent-vgg98"] Apr 22 18:46:45.361092 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.361066 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.362086 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.362064 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.364438 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.364412 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 18:46:45.364551 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.364431 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-kxtjg\"" Apr 22 18:46:45.364710 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.364692 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 18:46:45.365014 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.364977 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 18:46:45.365315 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.365297 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 18:46:45.365424 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.365366 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-kfn2f" Apr 22 18:46:45.365655 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.365637 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 18:46:45.366392 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.366371 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 18:46:45.366880 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.366787 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-xtggb\"" Apr 22 18:46:45.367389 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.367356 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rqzlh" Apr 22 18:46:45.368444 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.368042 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 18:46:45.368444 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.368168 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8kt4h\"" Apr 22 18:46:45.368444 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.368313 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 18:46:45.368444 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.368442 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 18:46:45.368686 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.368608 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 18:46:45.369612 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.368805 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.369698 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.369630 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.370760 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.370740 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:46:45.370928 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:45.370902 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zn7pv" podUID="16f54576-6941-4246-bcb0-89cfeef13253" Apr 22 18:46:45.371247 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.371228 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 18:46:45.371695 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.371595 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-cgt5g\"" Apr 22 18:46:45.372073 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.371813 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 18:46:45.372073 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.371899 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:45.372073 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.372012 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qqhr8\"" Apr 22 18:46:45.372073 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.372057 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-v9xh8\"" Apr 22 18:46:45.372417 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.372161 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 18:46:45.372861 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.372487 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 18:46:45.372861 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.372716 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 18:46:45.373982 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.373963 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:46:45.374079 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:45.374042 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84n9f" podUID="f5a171a3-924e-421e-a715-95ec17243358" Apr 22 18:46:45.374141 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.374086 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-rqxnk" Apr 22 18:46:45.376550 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.376486 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jr8wm\"" Apr 22 18:46:45.376550 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.376530 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 18:46:45.376769 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.376560 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 18:46:45.376769 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.376724 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:45.377822 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.377268 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.378803 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.378439 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vgg98" Apr 22 18:46:45.379936 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.379909 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea1be31b-74a3-48cb-b181-97ff279b206a-host\") pod \"node-ca-rqzlh\" (UID: \"ea1be31b-74a3-48cb-b181-97ff279b206a\") " pod="openshift-image-registry/node-ca-rqzlh" Apr 22 18:46:45.380043 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.379949 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-multus-cni-dir\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.380043 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.379977 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-host-var-lib-kubelet\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.380043 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380018 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/46f110eb-3658-4771-b650-29f48ae2842a-os-release\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.380198 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380045 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/82a7e0c0-777f-4215-8cb4-a68ede452c23-sys-fs\") pod \"aws-ebs-csi-driver-node-mv449\" (UID: \"82a7e0c0-777f-4215-8cb4-a68ede452c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.380198 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380069 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-host\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.380198 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380093 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-multus-socket-dir-parent\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.380198 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380118 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-host-run-multus-certs\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.380198 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380140 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46f110eb-3658-4771-b650-29f48ae2842a-system-cni-dir\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.380198 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380164 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/46f110eb-3658-4771-b650-29f48ae2842a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.380198 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380192 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-os-release\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.380549 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380237 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-host-run-k8s-cni-cncf-io\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.380549 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380278 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-hostroot\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.380549 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380307 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q68s\" (UniqueName: 
\"kubernetes.io/projected/46f110eb-3658-4771-b650-29f48ae2842a-kube-api-access-8q68s\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.380549 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380333 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/82a7e0c0-777f-4215-8cb4-a68ede452c23-device-dir\") pod \"aws-ebs-csi-driver-node-mv449\" (UID: \"82a7e0c0-777f-4215-8cb4-a68ede452c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.380549 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380355 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-etc-sysctl-d\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.380549 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380397 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-var-lib-kubelet\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.380549 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380448 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-multus-conf-dir\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.380549 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380516 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2db298c2-8f21-4074-b8fa-de93cd62c24a-multus-daemon-config\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.380549 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380547 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/46f110eb-3658-4771-b650-29f48ae2842a-cnibin\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.380989 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380567 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/46f110eb-3658-4771-b650-29f48ae2842a-cni-binary-copy\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.380989 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380547 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-ql9v8\"" Apr 22 18:46:45.380989 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380594 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/82a7e0c0-777f-4215-8cb4-a68ede452c23-registration-dir\") pod \"aws-ebs-csi-driver-node-mv449\" (UID: \"82a7e0c0-777f-4215-8cb4-a68ede452c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.380989 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380601 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 18:46:45.380989 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380617 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-etc-systemd\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.380989 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380638 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1a7ef120-7570-4923-8c42-9141c31054c0-etc-tuned\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.380989 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380673 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/af1a0441-5fdc-4aa3-ac3d-26b54d430c2b-tmp-dir\") pod \"node-resolver-kfn2f\" (UID: \"af1a0441-5fdc-4aa3-ac3d-26b54d430c2b\") " pod="openshift-dns/node-resolver-kfn2f" Apr 22 18:46:45.380989 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380717 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ea1be31b-74a3-48cb-b181-97ff279b206a-serviceca\") pod \"node-ca-rqzlh\" (UID: \"ea1be31b-74a3-48cb-b181-97ff279b206a\") " pod="openshift-image-registry/node-ca-rqzlh" Apr 22 18:46:45.380989 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380748 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-cnibin\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.380989 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380768 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-etc-sysconfig\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.380989 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380793 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-etc-sysctl-conf\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.380989 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380824 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-lib-modules\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.380989 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380886 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1a7ef120-7570-4923-8c42-9141c31054c0-tmp\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.380989 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380909 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmb9s\" (UniqueName: \"kubernetes.io/projected/1a7ef120-7570-4923-8c42-9141c31054c0-kube-api-access-cmb9s\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.380989 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380934 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-host-var-lib-cni-bin\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.380989 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380958 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-host-var-lib-cni-multus\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.380989 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.380980 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cf7h\" (UniqueName: \"kubernetes.io/projected/2db298c2-8f21-4074-b8fa-de93cd62c24a-kube-api-access-2cf7h\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.381768 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.381038 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46f110eb-3658-4771-b650-29f48ae2842a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.381768 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.381073 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/82a7e0c0-777f-4215-8cb4-a68ede452c23-etc-selinux\") pod \"aws-ebs-csi-driver-node-mv449\" (UID: \"82a7e0c0-777f-4215-8cb4-a68ede452c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.381768 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.381108 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb9f2\" (UniqueName: 
\"kubernetes.io/projected/af1a0441-5fdc-4aa3-ac3d-26b54d430c2b-kube-api-access-vb9f2\") pod \"node-resolver-kfn2f\" (UID: \"af1a0441-5fdc-4aa3-ac3d-26b54d430c2b\") " pod="openshift-dns/node-resolver-kfn2f" Apr 22 18:46:45.381768 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.381148 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-system-cni-dir\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.381768 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.381177 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2db298c2-8f21-4074-b8fa-de93cd62c24a-cni-binary-copy\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.381768 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.381242 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/82a7e0c0-777f-4215-8cb4-a68ede452c23-socket-dir\") pod \"aws-ebs-csi-driver-node-mv449\" (UID: \"82a7e0c0-777f-4215-8cb4-a68ede452c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.381768 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.381275 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-etc-modprobe-d\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.381768 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.381299 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-sys\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.381768 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.381390 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/af1a0441-5fdc-4aa3-ac3d-26b54d430c2b-hosts-file\") pod \"node-resolver-kfn2f\" (UID: \"af1a0441-5fdc-4aa3-ac3d-26b54d430c2b\") " pod="openshift-dns/node-resolver-kfn2f" Apr 22 18:46:45.381768 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.381417 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmdhv\" (UniqueName: \"kubernetes.io/projected/ea1be31b-74a3-48cb-b181-97ff279b206a-kube-api-access-xmdhv\") pod \"node-ca-rqzlh\" (UID: \"ea1be31b-74a3-48cb-b181-97ff279b206a\") " pod="openshift-image-registry/node-ca-rqzlh" Apr 22 18:46:45.381768 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.381449 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-host-run-netns\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.381768 ip-10-0-134-109 
kubenswrapper[2575]: I0422 18:46:45.381471 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-etc-kubernetes\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.381768 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.381470 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 18:46:45.381768 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.381520 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/46f110eb-3658-4771-b650-29f48ae2842a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.381768 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.381555 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/82a7e0c0-777f-4215-8cb4-a68ede452c23-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mv449\" (UID: \"82a7e0c0-777f-4215-8cb4-a68ede452c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.381768 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.381570 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6zht\" (UniqueName: \"kubernetes.io/projected/82a7e0c0-777f-4215-8cb4-a68ede452c23-kube-api-access-j6zht\") pod \"aws-ebs-csi-driver-node-mv449\" (UID: \"82a7e0c0-777f-4215-8cb4-a68ede452c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.381768 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.381585 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-etc-kubernetes\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.382570 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.381603 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-run\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.382570 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.381655 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 18:46:45.382570 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.381857 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 18:46:45.382570 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.381951 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 18:46:45.382570 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.381879 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kube-system\"/\"default-dockercfg-rvxv5\"" Apr 22 18:46:45.382570 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.382141 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 18:46:45.382570 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.382193 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 18:46:45.382570 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.382385 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 18:46:45.403148 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.403065 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:44 +0000 UTC" deadline="2027-11-13 02:40:14.809436939 +0000 UTC" Apr 22 18:46:45.403148 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.403097 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13663h53m29.406343857s" Apr 22 18:46:45.470054 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.470024 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:46:45.482050 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482016 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-system-cni-dir\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.482257 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482058 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-etc-modprobe-d\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.482257 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482090 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-host-run-netns\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.482257 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482113 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-run-openvswitch\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.482257 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482136 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-host-cni-bin\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.482257 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482163 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmdhv\" (UniqueName: \"kubernetes.io/projected/ea1be31b-74a3-48cb-b181-97ff279b206a-kube-api-access-xmdhv\") pod \"node-ca-rqzlh\" (UID: \"ea1be31b-74a3-48cb-b181-97ff279b206a\") " pod="openshift-image-registry/node-ca-rqzlh" Apr 22 18:46:45.482257 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482164 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-system-cni-dir\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.482257 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482184 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-etc-kubernetes\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.482257 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482225 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/46f110eb-3658-4771-b650-29f48ae2842a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.482257 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482252 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-etc-modprobe-d\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.482715 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482260 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-etc-kubernetes\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.482715 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482374 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/82a7e0c0-777f-4215-8cb4-a68ede452c23-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mv449\" (UID: \"82a7e0c0-777f-4215-8cb4-a68ede452c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.482715 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482411 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-run\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.482715 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482439 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs\") pod \"network-metrics-daemon-zn7pv\" (UID: \"16f54576-6941-4246-bcb0-89cfeef13253\") " pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:46:45.482715 
ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482465 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nprkx\" (UniqueName: \"kubernetes.io/projected/16f54576-6941-4246-bcb0-89cfeef13253-kube-api-access-nprkx\") pod \"network-metrics-daemon-zn7pv\" (UID: \"16f54576-6941-4246-bcb0-89cfeef13253\") " pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:46:45.482715 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482464 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/82a7e0c0-777f-4215-8cb4-a68ede452c23-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mv449\" (UID: \"82a7e0c0-777f-4215-8cb4-a68ede452c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.482715 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482485 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea1be31b-74a3-48cb-b181-97ff279b206a-host\") pod \"node-ca-rqzlh\" (UID: \"ea1be31b-74a3-48cb-b181-97ff279b206a\") " pod="openshift-image-registry/node-ca-rqzlh" Apr 22 18:46:45.482715 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-multus-cni-dir\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.482715 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482539 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-host-var-lib-kubelet\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.482715 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482556 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/46f110eb-3658-4771-b650-29f48ae2842a-os-release\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.482715 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482585 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/82a7e0c0-777f-4215-8cb4-a68ede452c23-sys-fs\") pod \"aws-ebs-csi-driver-node-mv449\" (UID: \"82a7e0c0-777f-4215-8cb4-a68ede452c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.482715 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482593 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-multus-cni-dir\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.482715 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482609 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-host\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " 
pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.482715 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482613 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-host-var-lib-kubelet\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.482715 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482613 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-run\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.482715 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482649 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/46f110eb-3658-4771-b650-29f48ae2842a-os-release\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.482715 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/82a7e0c0-777f-4215-8cb4-a68ede452c23-sys-fs\") pod \"aws-ebs-csi-driver-node-mv449\" (UID: \"82a7e0c0-777f-4215-8cb4-a68ede452c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.482715 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482660 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ea1be31b-74a3-48cb-b181-97ff279b206a-host\") pod \"node-ca-rqzlh\" (UID: \"ea1be31b-74a3-48cb-b181-97ff279b206a\") " pod="openshift-image-registry/node-ca-rqzlh" Apr 22 18:46:45.483554 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482666 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-host\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.483554 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482683 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-log-socket\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.483554 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482703 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-multus-socket-dir-parent\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.483554 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482727 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-host-run-multus-certs\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " 
pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.483554 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482755 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/46f110eb-3658-4771-b650-29f48ae2842a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.483554 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482772 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-host-run-multus-certs\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.483554 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482783 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-var-lib-openvswitch\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.483554 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482774 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-multus-socket-dir-parent\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.483554 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482847 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcjg5\" (UniqueName: \"kubernetes.io/projected/ec7921c7-6ea3-40af-a74d-07b404ff9eb9-kube-api-access-vcjg5\") pod \"iptables-alerter-rqxnk\" (UID: \"ec7921c7-6ea3-40af-a74d-07b404ff9eb9\") " pod="openshift-network-operator/iptables-alerter-rqxnk" Apr 22 18:46:45.483554 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482881 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/46f110eb-3658-4771-b650-29f48ae2842a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.483554 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482897 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/82a7e0c0-777f-4215-8cb4-a68ede452c23-device-dir\") pod \"aws-ebs-csi-driver-node-mv449\" (UID: \"82a7e0c0-777f-4215-8cb4-a68ede452c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.483554 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482932 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-etc-sysctl-d\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.483554 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482947 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/82a7e0c0-777f-4215-8cb4-a68ede452c23-device-dir\") pod \"aws-ebs-csi-driver-node-mv449\" (UID: \"82a7e0c0-777f-4215-8cb4-a68ede452c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.483554 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482959 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vb9f2\" (UniqueName: \"kubernetes.io/projected/af1a0441-5fdc-4aa3-ac3d-26b54d430c2b-kube-api-access-vb9f2\") pod \"node-resolver-kfn2f\" (UID: \"af1a0441-5fdc-4aa3-ac3d-26b54d430c2b\") " pod="openshift-dns/node-resolver-kfn2f" Apr 22 18:46:45.483554 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.482988 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-systemd-units\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.483554 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483011 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-run-ovn\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.483554 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483035 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-node-log\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.484176 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483056 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2db298c2-8f21-4074-b8fa-de93cd62c24a-multus-daemon-config\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.484176 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483065 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-etc-sysctl-d\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.484176 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-etc-systemd\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.484176 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483141 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1a7ef120-7570-4923-8c42-9141c31054c0-etc-tuned\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.484176 ip-10-0-134-109 
kubenswrapper[2575]: I0422 18:46:45.483160 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-host-slash\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.484176 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483181 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ea1be31b-74a3-48cb-b181-97ff279b206a-serviceca\") pod \"node-ca-rqzlh\" (UID: \"ea1be31b-74a3-48cb-b181-97ff279b206a\") " pod="openshift-image-registry/node-ca-rqzlh" Apr 22 18:46:45.484176 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483200 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-cnibin\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.484176 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483246 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-etc-systemd\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.484176 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483253 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmb9s\" (UniqueName: \"kubernetes.io/projected/1a7ef120-7570-4923-8c42-9141c31054c0-kube-api-access-cmb9s\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.484176 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483272 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/46f110eb-3658-4771-b650-29f48ae2842a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.484176 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483279 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7d7e2f0a-fdb6-465e-81f4-dd71e4ceb2e1-agent-certs\") pod \"konnectivity-agent-vgg98\" (UID: \"7d7e2f0a-fdb6-465e-81f4-dd71e4ceb2e1\") " pod="kube-system/konnectivity-agent-vgg98" Apr 22 18:46:45.484176 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483308 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec7921c7-6ea3-40af-a74d-07b404ff9eb9-host-slash\") pod \"iptables-alerter-rqxnk\" (UID: \"ec7921c7-6ea3-40af-a74d-07b404ff9eb9\") " pod="openshift-network-operator/iptables-alerter-rqxnk" Apr 22 18:46:45.484176 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483333 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-cnibin\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " 
pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.484176 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483336 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-host-var-lib-cni-multus\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.484176 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483374 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2cf7h\" (UniqueName: \"kubernetes.io/projected/2db298c2-8f21-4074-b8fa-de93cd62c24a-kube-api-access-2cf7h\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.484176 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483379 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-host-var-lib-cni-multus\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.484176 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483402 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e473ff71-80f3-4edf-8096-6fa108acae8a-ovn-node-metrics-cert\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.485032 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483431 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ec7921c7-6ea3-40af-a74d-07b404ff9eb9-iptables-alerter-script\") pod \"iptables-alerter-rqxnk\" (UID: \"ec7921c7-6ea3-40af-a74d-07b404ff9eb9\") " pod="openshift-network-operator/iptables-alerter-rqxnk" Apr 22 18:46:45.485032 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483460 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2db298c2-8f21-4074-b8fa-de93cd62c24a-cni-binary-copy\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.485032 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483486 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/82a7e0c0-777f-4215-8cb4-a68ede452c23-socket-dir\") pod \"aws-ebs-csi-driver-node-mv449\" (UID: \"82a7e0c0-777f-4215-8cb4-a68ede452c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.485032 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483511 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-sys\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.485032 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/af1a0441-5fdc-4aa3-ac3d-26b54d430c2b-hosts-file\") pod \"node-resolver-kfn2f\" (UID: \"af1a0441-5fdc-4aa3-ac3d-26b54d430c2b\") " pod="openshift-dns/node-resolver-kfn2f" Apr 22 18:46:45.485032 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483590 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/af1a0441-5fdc-4aa3-ac3d-26b54d430c2b-hosts-file\") pod \"node-resolver-kfn2f\" (UID: \"af1a0441-5fdc-4aa3-ac3d-26b54d430c2b\") " pod="openshift-dns/node-resolver-kfn2f" Apr 22 18:46:45.485032 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483687 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/82a7e0c0-777f-4215-8cb4-a68ede452c23-socket-dir\") pod \"aws-ebs-csi-driver-node-mv449\" (UID: \"82a7e0c0-777f-4215-8cb4-a68ede452c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.485032 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483704 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2db298c2-8f21-4074-b8fa-de93cd62c24a-multus-daemon-config\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.485032 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483709 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ea1be31b-74a3-48cb-b181-97ff279b206a-serviceca\") pod \"node-ca-rqzlh\" (UID: \"ea1be31b-74a3-48cb-b181-97ff279b206a\") " pod="openshift-image-registry/node-ca-rqzlh" Apr 22 18:46:45.485032 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483736 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-sys\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.485032 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483505 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:46:45.485032 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483764 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e473ff71-80f3-4edf-8096-6fa108acae8a-ovnkube-config\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.485032 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483784 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7d7e2f0a-fdb6-465e-81f4-dd71e4ceb2e1-konnectivity-ca\") pod \"konnectivity-agent-vgg98\" (UID: \"7d7e2f0a-fdb6-465e-81f4-dd71e4ceb2e1\") " pod="kube-system/konnectivity-agent-vgg98" Apr 22 18:46:45.485032 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483823 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-host-run-netns\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.485032 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483876 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6zht\" (UniqueName: \"kubernetes.io/projected/82a7e0c0-777f-4215-8cb4-a68ede452c23-kube-api-access-j6zht\") pod \"aws-ebs-csi-driver-node-mv449\" (UID: \"82a7e0c0-777f-4215-8cb4-a68ede452c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.485032 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483889 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-host-run-netns\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.485032 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483903 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-etc-kubernetes\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.485032 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483931 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-run-systemd\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.485670 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483937 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2db298c2-8f21-4074-b8fa-de93cd62c24a-cni-binary-copy\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.485670 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483958 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/46f110eb-3658-4771-b650-29f48ae2842a-system-cni-dir\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.485670 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.483988 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c44v9\" (UniqueName: \"kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9\") pod \"network-check-target-84n9f\" (UID: \"f5a171a3-924e-421e-a715-95ec17243358\") " pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:46:45.485670 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484007 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-etc-kubernetes\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.485670 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484014 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.485670 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484042 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e473ff71-80f3-4edf-8096-6fa108acae8a-env-overrides\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.485670 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484056 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46f110eb-3658-4771-b650-29f48ae2842a-system-cni-dir\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.485670 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484069 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-os-release\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.485670 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484085 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-host-run-k8s-cni-cncf-io\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.485670 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484100 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-hostroot\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " 
pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.485670 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484137 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8q68s\" (UniqueName: \"kubernetes.io/projected/46f110eb-3658-4771-b650-29f48ae2842a-kube-api-access-8q68s\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.485670 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484143 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-os-release\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.485670 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484140 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-host-run-k8s-cni-cncf-io\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.485670 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-hostroot\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.485670 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484184 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-var-lib-kubelet\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.485670 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484253 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:45.485670 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484273 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-var-lib-kubelet\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.485670 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484256 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-host-cni-netd\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.486275 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484325 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-multus-conf-dir\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.486275 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484350 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/46f110eb-3658-4771-b650-29f48ae2842a-cnibin\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.486275 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484389 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/46f110eb-3658-4771-b650-29f48ae2842a-cni-binary-copy\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.486275 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484406 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-multus-conf-dir\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.486275 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484414 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/82a7e0c0-777f-4215-8cb4-a68ede452c23-registration-dir\") pod \"aws-ebs-csi-driver-node-mv449\" (UID: \"82a7e0c0-777f-4215-8cb4-a68ede452c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.486275 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484425 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/46f110eb-3658-4771-b650-29f48ae2842a-cnibin\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.486275 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/af1a0441-5fdc-4aa3-ac3d-26b54d430c2b-tmp-dir\") pod \"node-resolver-kfn2f\" (UID: \"af1a0441-5fdc-4aa3-ac3d-26b54d430c2b\") " pod="openshift-dns/node-resolver-kfn2f" Apr 22 18:46:45.486275 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484477 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-host-kubelet\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.486275 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484485 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/82a7e0c0-777f-4215-8cb4-a68ede452c23-registration-dir\") pod \"aws-ebs-csi-driver-node-mv449\" (UID: \"82a7e0c0-777f-4215-8cb4-a68ede452c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.486275 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484517 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-host-run-ovn-kubernetes\") pod \"ovnkube-node-p7bgj\" (UID: 
\"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.486275 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484547 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e473ff71-80f3-4edf-8096-6fa108acae8a-ovnkube-script-lib\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.486275 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484576 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-etc-sysconfig\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.486275 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484601 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-etc-sysctl-conf\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.486275 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484627 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-lib-modules\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.486275 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484644 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-etc-sysconfig\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.486275 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484651 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1a7ef120-7570-4923-8c42-9141c31054c0-tmp\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.486275 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484701 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-etc-openvswitch\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.486915 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484727 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-host-var-lib-cni-bin\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.486915 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484759 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-lib-modules\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.486915 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484759 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/1a7ef120-7570-4923-8c42-9141c31054c0-etc-sysctl-conf\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.486915 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484755 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46f110eb-3658-4771-b650-29f48ae2842a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.486915 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484813 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/82a7e0c0-777f-4215-8cb4-a68ede452c23-etc-selinux\") pod \"aws-ebs-csi-driver-node-mv449\" (UID: \"82a7e0c0-777f-4215-8cb4-a68ede452c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.486915 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484824 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2db298c2-8f21-4074-b8fa-de93cd62c24a-host-var-lib-cni-bin\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.486915 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484772 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/af1a0441-5fdc-4aa3-ac3d-26b54d430c2b-tmp-dir\") pod \"node-resolver-kfn2f\" (UID: \"af1a0441-5fdc-4aa3-ac3d-26b54d430c2b\") " pod="openshift-dns/node-resolver-kfn2f" Apr 22 18:46:45.486915 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484842 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j977\" (UniqueName: \"kubernetes.io/projected/e473ff71-80f3-4edf-8096-6fa108acae8a-kube-api-access-7j977\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.486915 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484858 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/46f110eb-3658-4771-b650-29f48ae2842a-cni-binary-copy\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.486915 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.484897 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46f110eb-3658-4771-b650-29f48ae2842a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.486915 ip-10-0-134-109 
kubenswrapper[2575]: I0422 18:46:45.484976 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/82a7e0c0-777f-4215-8cb4-a68ede452c23-etc-selinux\") pod \"aws-ebs-csi-driver-node-mv449\" (UID: \"82a7e0c0-777f-4215-8cb4-a68ede452c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.487466 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.486996 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/1a7ef120-7570-4923-8c42-9141c31054c0-etc-tuned\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.487466 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.487040 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1a7ef120-7570-4923-8c42-9141c31054c0-tmp\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.494055 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.493935 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb9f2\" (UniqueName: \"kubernetes.io/projected/af1a0441-5fdc-4aa3-ac3d-26b54d430c2b-kube-api-access-vb9f2\") pod \"node-resolver-kfn2f\" (UID: \"af1a0441-5fdc-4aa3-ac3d-26b54d430c2b\") " pod="openshift-dns/node-resolver-kfn2f" Apr 22 18:46:45.494055 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.493935 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmdhv\" (UniqueName: \"kubernetes.io/projected/ea1be31b-74a3-48cb-b181-97ff279b206a-kube-api-access-xmdhv\") pod \"node-ca-rqzlh\" (UID: \"ea1be31b-74a3-48cb-b181-97ff279b206a\") " pod="openshift-image-registry/node-ca-rqzlh" Apr 22 18:46:45.494458 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.494437 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cf7h\" (UniqueName: \"kubernetes.io/projected/2db298c2-8f21-4074-b8fa-de93cd62c24a-kube-api-access-2cf7h\") pod \"multus-rlzcp\" (UID: \"2db298c2-8f21-4074-b8fa-de93cd62c24a\") " pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.494856 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.494834 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q68s\" (UniqueName: \"kubernetes.io/projected/46f110eb-3658-4771-b650-29f48ae2842a-kube-api-access-8q68s\") pod \"multus-additional-cni-plugins-vdhgw\" (UID: \"46f110eb-3658-4771-b650-29f48ae2842a\") " pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.495131 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.495114 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6zht\" (UniqueName: \"kubernetes.io/projected/82a7e0c0-777f-4215-8cb4-a68ede452c23-kube-api-access-j6zht\") pod \"aws-ebs-csi-driver-node-mv449\" (UID: \"82a7e0c0-777f-4215-8cb4-a68ede452c23\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.496182 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.496143 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmb9s\" (UniqueName: \"kubernetes.io/projected/1a7ef120-7570-4923-8c42-9141c31054c0-kube-api-access-cmb9s\") pod \"tuned-4w69n\" (UID: \"1a7ef120-7570-4923-8c42-9141c31054c0\") " 
pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.586034 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.585999 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7j977\" (UniqueName: \"kubernetes.io/projected/e473ff71-80f3-4edf-8096-6fa108acae8a-kube-api-access-7j977\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.586034 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586036 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-host-run-netns\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.586294 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586060 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-run-openvswitch\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.586294 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586080 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-host-cni-bin\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.586294 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586105 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs\") pod \"network-metrics-daemon-zn7pv\" (UID: \"16f54576-6941-4246-bcb0-89cfeef13253\") " pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:46:45.586294 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586126 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nprkx\" (UniqueName: \"kubernetes.io/projected/16f54576-6941-4246-bcb0-89cfeef13253-kube-api-access-nprkx\") pod \"network-metrics-daemon-zn7pv\" (UID: \"16f54576-6941-4246-bcb0-89cfeef13253\") " pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:46:45.586294 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586137 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-host-run-netns\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.586294 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586137 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-run-openvswitch\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.586294 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586148 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-log-socket\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.586294 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586190 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-host-cni-bin\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.586294 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586237 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-log-socket\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.586294 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586238 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-var-lib-openvswitch\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.586294 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:45.586252 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:45.586294 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586268 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcjg5\" (UniqueName: \"kubernetes.io/projected/ec7921c7-6ea3-40af-a74d-07b404ff9eb9-kube-api-access-vcjg5\") pod \"iptables-alerter-rqxnk\" (UID: \"ec7921c7-6ea3-40af-a74d-07b404ff9eb9\") " pod="openshift-network-operator/iptables-alerter-rqxnk" Apr 22 18:46:45.586294 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586296 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-systemd-units\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.586959 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586331 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-systemd-units\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.586959 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:45.586340 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs podName:16f54576-6941-4246-bcb0-89cfeef13253 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:46.086305343 +0000 UTC m=+3.148789666 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs") pod "network-metrics-daemon-zn7pv" (UID: "16f54576-6941-4246-bcb0-89cfeef13253") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:45.586959 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586370 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-var-lib-openvswitch\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.586959 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586392 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-run-ovn\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.586959 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586419 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-node-log\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.586959 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586446 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-host-slash\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.586959 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586474 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7d7e2f0a-fdb6-465e-81f4-dd71e4ceb2e1-agent-certs\") pod \"konnectivity-agent-vgg98\" (UID: \"7d7e2f0a-fdb6-465e-81f4-dd71e4ceb2e1\") " pod="kube-system/konnectivity-agent-vgg98" Apr 22 18:46:45.586959 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586486 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-run-ovn\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.586959 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586497 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec7921c7-6ea3-40af-a74d-07b404ff9eb9-host-slash\") pod \"iptables-alerter-rqxnk\" (UID: \"ec7921c7-6ea3-40af-a74d-07b404ff9eb9\") " pod="openshift-network-operator/iptables-alerter-rqxnk" Apr 22 18:46:45.586959 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586501 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-node-log\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.586959 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586513 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-host-slash\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.586959 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586525 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e473ff71-80f3-4edf-8096-6fa108acae8a-ovn-node-metrics-cert\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.586959 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586537 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ec7921c7-6ea3-40af-a74d-07b404ff9eb9-host-slash\") pod \"iptables-alerter-rqxnk\" (UID: \"ec7921c7-6ea3-40af-a74d-07b404ff9eb9\") " pod="openshift-network-operator/iptables-alerter-rqxnk" Apr 22 18:46:45.586959 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586551 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ec7921c7-6ea3-40af-a74d-07b404ff9eb9-iptables-alerter-script\") pod \"iptables-alerter-rqxnk\" (UID: \"ec7921c7-6ea3-40af-a74d-07b404ff9eb9\") " pod="openshift-network-operator/iptables-alerter-rqxnk" Apr 22 18:46:45.586959 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586585 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e473ff71-80f3-4edf-8096-6fa108acae8a-ovnkube-config\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.586959 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586609 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7d7e2f0a-fdb6-465e-81f4-dd71e4ceb2e1-konnectivity-ca\") pod \"konnectivity-agent-vgg98\" (UID: \"7d7e2f0a-fdb6-465e-81f4-dd71e4ceb2e1\") " pod="kube-system/konnectivity-agent-vgg98" Apr 22 18:46:45.586959 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586739 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-run-systemd\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.587797 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586779 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c44v9\" (UniqueName: \"kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9\") pod \"network-check-target-84n9f\" (UID: \"f5a171a3-924e-421e-a715-95ec17243358\") " pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:46:45.587797 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586807 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p7bgj\" (UID: 
\"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.587797 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586835 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e473ff71-80f3-4edf-8096-6fa108acae8a-env-overrides\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.587797 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586867 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-host-cni-netd\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.587797 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586897 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-host-kubelet\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.587797 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586924 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-host-run-ovn-kubernetes\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.587797 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586946 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-run-systemd\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.587797 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586951 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e473ff71-80f3-4edf-8096-6fa108acae8a-ovnkube-script-lib\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.587797 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.586993 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.587797 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.587027 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-host-kubelet\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.587797 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.587177 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/7d7e2f0a-fdb6-465e-81f4-dd71e4ceb2e1-konnectivity-ca\") pod \"konnectivity-agent-vgg98\" (UID: \"7d7e2f0a-fdb6-465e-81f4-dd71e4ceb2e1\") " pod="kube-system/konnectivity-agent-vgg98" Apr 22 18:46:45.587797 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.587191 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-host-run-ovn-kubernetes\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.587797 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.587194 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e473ff71-80f3-4edf-8096-6fa108acae8a-ovnkube-config\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.587797 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.587248 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ec7921c7-6ea3-40af-a74d-07b404ff9eb9-iptables-alerter-script\") pod \"iptables-alerter-rqxnk\" (UID: \"ec7921c7-6ea3-40af-a74d-07b404ff9eb9\") " pod="openshift-network-operator/iptables-alerter-rqxnk" Apr 22 18:46:45.587797 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.587250 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-host-cni-netd\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.587797 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.587314 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-etc-openvswitch\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.587797 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.587377 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e473ff71-80f3-4edf-8096-6fa108acae8a-etc-openvswitch\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.588656 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.587435 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e473ff71-80f3-4edf-8096-6fa108acae8a-env-overrides\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.588656 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.587689 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e473ff71-80f3-4edf-8096-6fa108acae8a-ovnkube-script-lib\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.589179 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.589159 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7d7e2f0a-fdb6-465e-81f4-dd71e4ceb2e1-agent-certs\") pod \"konnectivity-agent-vgg98\" (UID: \"7d7e2f0a-fdb6-465e-81f4-dd71e4ceb2e1\") " pod="kube-system/konnectivity-agent-vgg98" Apr 22 18:46:45.589348 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.589332 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e473ff71-80f3-4edf-8096-6fa108acae8a-ovn-node-metrics-cert\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.592759 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:45.592737 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:45.592836 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:45.592763 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:45.592836 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:45.592777 2575 projected.go:194] Error preparing data for projected volume kube-api-access-c44v9 for pod openshift-network-diagnostics/network-check-target-84n9f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:45.592923 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:45.592858 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9 podName:f5a171a3-924e-421e-a715-95ec17243358 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:46.092836159 +0000 UTC m=+3.155320469 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-c44v9" (UniqueName: "kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9") pod "network-check-target-84n9f" (UID: "f5a171a3-924e-421e-a715-95ec17243358") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:45.594395 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.594367 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nprkx\" (UniqueName: \"kubernetes.io/projected/16f54576-6941-4246-bcb0-89cfeef13253-kube-api-access-nprkx\") pod \"network-metrics-daemon-zn7pv\" (UID: \"16f54576-6941-4246-bcb0-89cfeef13253\") " pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:46:45.594557 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.594540 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcjg5\" (UniqueName: \"kubernetes.io/projected/ec7921c7-6ea3-40af-a74d-07b404ff9eb9-kube-api-access-vcjg5\") pod \"iptables-alerter-rqxnk\" (UID: \"ec7921c7-6ea3-40af-a74d-07b404ff9eb9\") " pod="openshift-network-operator/iptables-alerter-rqxnk" Apr 22 18:46:45.594785 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.594763 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j977\" (UniqueName: \"kubernetes.io/projected/e473ff71-80f3-4edf-8096-6fa108acae8a-kube-api-access-7j977\") pod \"ovnkube-node-p7bgj\" (UID: \"e473ff71-80f3-4edf-8096-6fa108acae8a\") " pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.677263 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.677171 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vdhgw" Apr 22 18:46:45.683949 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.683921 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" Apr 22 18:46:45.693744 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.693713 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kfn2f" Apr 22 18:46:45.699395 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.699358 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rqzlh" Apr 22 18:46:45.708158 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.708128 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4w69n" Apr 22 18:46:45.716930 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.716903 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rlzcp" Apr 22 18:46:45.724721 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.724695 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rqxnk" Apr 22 18:46:45.731505 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.731479 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:46:45.738265 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.738241 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-vgg98" Apr 22 18:46:45.771347 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:45.771305 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:46.090388 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:46.090295 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs\") pod \"network-metrics-daemon-zn7pv\" (UID: \"16f54576-6941-4246-bcb0-89cfeef13253\") " pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:46:46.090550 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:46.090475 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:46.090609 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:46.090554 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs podName:16f54576-6941-4246-bcb0-89cfeef13253 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:47.090534205 +0000 UTC m=+4.153018533 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs") pod "network-metrics-daemon-zn7pv" (UID: "16f54576-6941-4246-bcb0-89cfeef13253") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:46.119333 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:46.119303 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec7921c7_6ea3_40af_a74d_07b404ff9eb9.slice/crio-56443a6055f9a8e4f3a391ae8d908f52b669981fde608364d2d871554a18ee3c WatchSource:0}: Error finding container 56443a6055f9a8e4f3a391ae8d908f52b669981fde608364d2d871554a18ee3c: Status 404 returned error can't find the container with id 56443a6055f9a8e4f3a391ae8d908f52b669981fde608364d2d871554a18ee3c Apr 22 18:46:46.120607 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:46.120584 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d7e2f0a_fdb6_465e_81f4_dd71e4ceb2e1.slice/crio-89c418496d51f9ff87b125e1fb6f381cb8c06d035b71290859bc49f2767ed482 WatchSource:0}: Error finding container 89c418496d51f9ff87b125e1fb6f381cb8c06d035b71290859bc49f2767ed482: Status 404 returned error can't find the container with id 89c418496d51f9ff87b125e1fb6f381cb8c06d035b71290859bc49f2767ed482 Apr 22 18:46:46.121493 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:46.121466 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a7ef120_7570_4923_8c42_9141c31054c0.slice/crio-cd793d12979521a8e02e9c9313986e2c296fc0d764678df1730274f1119bb94c WatchSource:0}: Error finding container cd793d12979521a8e02e9c9313986e2c296fc0d764678df1730274f1119bb94c: Status 404 returned error can't find the container with id cd793d12979521a8e02e9c9313986e2c296fc0d764678df1730274f1119bb94c Apr 22 18:46:46.124746 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:46.124720 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea1be31b_74a3_48cb_b181_97ff279b206a.slice/crio-c329689dcb1a0f6596c8b9b8e9c5192faa3422ba7c109c98d4854d8747e899e9 WatchSource:0}: Error finding container c329689dcb1a0f6596c8b9b8e9c5192faa3422ba7c109c98d4854d8747e899e9: Status 404 returned error can't find the container with id c329689dcb1a0f6596c8b9b8e9c5192faa3422ba7c109c98d4854d8747e899e9 Apr 22 18:46:46.125600 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:46.125557 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46f110eb_3658_4771_b650_29f48ae2842a.slice/crio-c27126bf50ecf8eedb4109b07a1da2ec711e478a7ef2a486ddef8ec62cf3844c WatchSource:0}: Error finding container c27126bf50ecf8eedb4109b07a1da2ec711e478a7ef2a486ddef8ec62cf3844c: Status 404 returned error can't find the container with id c27126bf50ecf8eedb4109b07a1da2ec711e478a7ef2a486ddef8ec62cf3844c Apr 22 18:46:46.127098 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:46.126947 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82a7e0c0_777f_4215_8cb4_a68ede452c23.slice/crio-1665018f3a73cd1d99aad99cde2c703962be0700fc55234fe28507cdb7ce1f79 WatchSource:0}: Error finding container 1665018f3a73cd1d99aad99cde2c703962be0700fc55234fe28507cdb7ce1f79: Status 404 returned error can't find the container with id 1665018f3a73cd1d99aad99cde2c703962be0700fc55234fe28507cdb7ce1f79 Apr 22 18:46:46.127975 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:46.127951 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2db298c2_8f21_4074_b8fa_de93cd62c24a.slice/crio-0d2f041bc395232e933b1ba651f022b28bdbc7f8b3f9fe741e74701ad7b04e5c WatchSource:0}: Error finding container 0d2f041bc395232e933b1ba651f022b28bdbc7f8b3f9fe741e74701ad7b04e5c: Status 404 returned error can't find the container with id 0d2f041bc395232e933b1ba651f022b28bdbc7f8b3f9fe741e74701ad7b04e5c Apr 22 18:46:46.129371 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:46.129349 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode473ff71_80f3_4edf_8096_6fa108acae8a.slice/crio-63400a3f36db6402b79a6684e12ca6c4e0f9add43289cd13819d9b2ffbfd4ce9 WatchSource:0}: Error finding container 63400a3f36db6402b79a6684e12ca6c4e0f9add43289cd13819d9b2ffbfd4ce9: Status 404 returned error can't find the container with id 63400a3f36db6402b79a6684e12ca6c4e0f9add43289cd13819d9b2ffbfd4ce9 Apr 22 18:46:46.129677 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:46:46.129648 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf1a0441_5fdc_4aa3_ac3d_26b54d430c2b.slice/crio-1ea8c469695d4cdf6533f3fe14a1593795d446481ade34169ed1cb9e453a25bf WatchSource:0}: Error finding container 1ea8c469695d4cdf6533f3fe14a1593795d446481ade34169ed1cb9e453a25bf: Status 404 returned error can't find the container with id 1ea8c469695d4cdf6533f3fe14a1593795d446481ade34169ed1cb9e453a25bf Apr 22 18:46:46.191640 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:46.191477 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c44v9\" (UniqueName: \"kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9\") pod \"network-check-target-84n9f\" (UID: 
\"f5a171a3-924e-421e-a715-95ec17243358\") " pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:46:46.191775 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:46.191646 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:46.191775 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:46.191666 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:46.191775 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:46.191676 2575 projected.go:194] Error preparing data for projected volume kube-api-access-c44v9 for pod openshift-network-diagnostics/network-check-target-84n9f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:46.191775 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:46.191724 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9 podName:f5a171a3-924e-421e-a715-95ec17243358 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:47.191710412 +0000 UTC m=+4.254194722 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-c44v9" (UniqueName: "kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9") pod "network-check-target-84n9f" (UID: "f5a171a3-924e-421e-a715-95ec17243358") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:46.403736 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:46.403611 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:44 +0000 UTC" deadline="2027-12-30 01:46:33.114139036 +0000 UTC" Apr 22 18:46:46.403736 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:46.403645 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14790h59m46.710496525s" Apr 22 18:46:46.473196 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:46.473139 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kfn2f" event={"ID":"af1a0441-5fdc-4aa3-ac3d-26b54d430c2b","Type":"ContainerStarted","Data":"1ea8c469695d4cdf6533f3fe14a1593795d446481ade34169ed1cb9e453a25bf"} Apr 22 18:46:46.475882 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:46.475835 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" event={"ID":"e473ff71-80f3-4edf-8096-6fa108acae8a","Type":"ContainerStarted","Data":"63400a3f36db6402b79a6684e12ca6c4e0f9add43289cd13819d9b2ffbfd4ce9"} Apr 22 18:46:46.477905 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:46.477874 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rlzcp" event={"ID":"2db298c2-8f21-4074-b8fa-de93cd62c24a","Type":"ContainerStarted","Data":"0d2f041bc395232e933b1ba651f022b28bdbc7f8b3f9fe741e74701ad7b04e5c"} Apr 22 18:46:46.479048 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:46.479022 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4w69n" 
event={"ID":"1a7ef120-7570-4923-8c42-9141c31054c0","Type":"ContainerStarted","Data":"cd793d12979521a8e02e9c9313986e2c296fc0d764678df1730274f1119bb94c"} Apr 22 18:46:46.480188 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:46.480162 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vgg98" event={"ID":"7d7e2f0a-fdb6-465e-81f4-dd71e4ceb2e1","Type":"ContainerStarted","Data":"89c418496d51f9ff87b125e1fb6f381cb8c06d035b71290859bc49f2767ed482"} Apr 22 18:46:46.482118 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:46.482091 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-109.ec2.internal" event={"ID":"b12aca16166beec2da98d7da868da1f9","Type":"ContainerStarted","Data":"7ccdc8e8bc4318588c6d67d3ab126c8e087711ca8f88ab5836656af1c2cf400f"} Apr 22 18:46:46.483578 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:46.483547 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vdhgw" event={"ID":"46f110eb-3658-4771-b650-29f48ae2842a","Type":"ContainerStarted","Data":"c27126bf50ecf8eedb4109b07a1da2ec711e478a7ef2a486ddef8ec62cf3844c"} Apr 22 18:46:46.486032 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:46.485896 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" event={"ID":"82a7e0c0-777f-4215-8cb4-a68ede452c23","Type":"ContainerStarted","Data":"1665018f3a73cd1d99aad99cde2c703962be0700fc55234fe28507cdb7ce1f79"} Apr 22 18:46:46.487518 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:46.487478 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rqzlh" event={"ID":"ea1be31b-74a3-48cb-b181-97ff279b206a","Type":"ContainerStarted","Data":"c329689dcb1a0f6596c8b9b8e9c5192faa3422ba7c109c98d4854d8747e899e9"} Apr 22 18:46:46.489848 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:46.489821 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rqxnk" event={"ID":"ec7921c7-6ea3-40af-a74d-07b404ff9eb9","Type":"ContainerStarted","Data":"56443a6055f9a8e4f3a391ae8d908f52b669981fde608364d2d871554a18ee3c"} Apr 22 18:46:46.496749 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:46.496695 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-109.ec2.internal" podStartSLOduration=2.496677806 podStartE2EDuration="2.496677806s" podCreationTimestamp="2026-04-22 18:46:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:46.496373227 +0000 UTC m=+3.558857559" watchObservedRunningTime="2026-04-22 18:46:46.496677806 +0000 UTC m=+3.559162138" Apr 22 18:46:47.099226 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:47.098578 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs\") pod \"network-metrics-daemon-zn7pv\" (UID: \"16f54576-6941-4246-bcb0-89cfeef13253\") " pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:46:47.099226 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:47.098746 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:47.099226 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:47.098812 2575 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs podName:16f54576-6941-4246-bcb0-89cfeef13253 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:49.098793075 +0000 UTC m=+6.161277389 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs") pod "network-metrics-daemon-zn7pv" (UID: "16f54576-6941-4246-bcb0-89cfeef13253") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:47.200228 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:47.199546 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c44v9\" (UniqueName: \"kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9\") pod \"network-check-target-84n9f\" (UID: \"f5a171a3-924e-421e-a715-95ec17243358\") " pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:46:47.200228 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:47.199752 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:47.200228 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:47.199770 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:47.200228 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:47.199783 2575 projected.go:194] Error preparing data for projected volume kube-api-access-c44v9 for pod openshift-network-diagnostics/network-check-target-84n9f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:47.200228 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:47.199839 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9 podName:f5a171a3-924e-421e-a715-95ec17243358 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:49.199821562 +0000 UTC m=+6.262305890 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-c44v9" (UniqueName: "kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9") pod "network-check-target-84n9f" (UID: "f5a171a3-924e-421e-a715-95ec17243358") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:47.467995 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:47.467913 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:46:47.468439 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:47.468041 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-84n9f" podUID="f5a171a3-924e-421e-a715-95ec17243358" Apr 22 18:46:47.468513 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:47.468481 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:46:47.468628 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:47.468603 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zn7pv" podUID="16f54576-6941-4246-bcb0-89cfeef13253" Apr 22 18:46:47.495517 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:47.494570 2575 generic.go:358] "Generic (PLEG): container finished" podID="7cab3aff7832fd5bf55174d8022bd378" containerID="357c5b20a4d31f6681a34a1c7330300444b603ffce8f672bd411da39334cc593" exitCode=0 Apr 22 18:46:47.495517 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:47.495105 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-109.ec2.internal" event={"ID":"7cab3aff7832fd5bf55174d8022bd378","Type":"ContainerDied","Data":"357c5b20a4d31f6681a34a1c7330300444b603ffce8f672bd411da39334cc593"} Apr 22 18:46:48.508506 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:48.508456 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-109.ec2.internal" event={"ID":"7cab3aff7832fd5bf55174d8022bd378","Type":"ContainerStarted","Data":"ad8b8e84cf5cec6eb16dacf073710f8b46811390e48674de9b5061a0ab9f6504"} Apr 22 18:46:49.115031 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:49.114343 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs\") pod \"network-metrics-daemon-zn7pv\" (UID: \"16f54576-6941-4246-bcb0-89cfeef13253\") " pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:46:49.115031 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:49.114502 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:49.115031 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:49.114682 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs podName:16f54576-6941-4246-bcb0-89cfeef13253 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:53.11466177 +0000 UTC m=+10.177146084 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs") pod "network-metrics-daemon-zn7pv" (UID: "16f54576-6941-4246-bcb0-89cfeef13253") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:49.215153 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:49.214973 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c44v9\" (UniqueName: \"kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9\") pod \"network-check-target-84n9f\" (UID: \"f5a171a3-924e-421e-a715-95ec17243358\") " pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:46:49.215153 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:49.215147 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:49.215406 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:49.215168 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:49.215406 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:49.215182 2575 projected.go:194] Error preparing data for projected volume kube-api-access-c44v9 for pod openshift-network-diagnostics/network-check-target-84n9f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:49.215406 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:49.215260 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9 podName:f5a171a3-924e-421e-a715-95ec17243358 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:53.215242034 +0000 UTC m=+10.277726346 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-c44v9" (UniqueName: "kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9") pod "network-check-target-84n9f" (UID: "f5a171a3-924e-421e-a715-95ec17243358") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:49.468366 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:49.468166 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:46:49.468366 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:49.468240 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:46:49.468587 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:49.468461 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-84n9f" podUID="f5a171a3-924e-421e-a715-95ec17243358" Apr 22 18:46:49.468983 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:49.468952 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zn7pv" podUID="16f54576-6941-4246-bcb0-89cfeef13253" Apr 22 18:46:51.464893 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:51.464786 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:46:51.464893 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:51.464817 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:46:51.465505 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:51.464928 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84n9f" podUID="f5a171a3-924e-421e-a715-95ec17243358" Apr 22 18:46:51.465505 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:51.465384 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zn7pv" podUID="16f54576-6941-4246-bcb0-89cfeef13253" Apr 22 18:46:53.149487 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:53.149443 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs\") pod \"network-metrics-daemon-zn7pv\" (UID: \"16f54576-6941-4246-bcb0-89cfeef13253\") " pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:46:53.149969 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:53.149597 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:53.149969 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:53.149682 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs podName:16f54576-6941-4246-bcb0-89cfeef13253 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:01.149659902 +0000 UTC m=+18.212144228 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs") pod "network-metrics-daemon-zn7pv" (UID: "16f54576-6941-4246-bcb0-89cfeef13253") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:53.250651 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:53.250554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c44v9\" (UniqueName: \"kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9\") pod \"network-check-target-84n9f\" (UID: \"f5a171a3-924e-421e-a715-95ec17243358\") " pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:46:53.250811 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:53.250756 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:53.250811 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:53.250782 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:53.250811 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:53.250797 2575 projected.go:194] Error preparing data for projected volume kube-api-access-c44v9 for pod openshift-network-diagnostics/network-check-target-84n9f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:53.250984 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:53.250864 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9 podName:f5a171a3-924e-421e-a715-95ec17243358 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:01.250845031 +0000 UTC m=+18.313329356 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-c44v9" (UniqueName: "kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9") pod "network-check-target-84n9f" (UID: "f5a171a3-924e-421e-a715-95ec17243358") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:53.359536 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:53.359469 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-109.ec2.internal" podStartSLOduration=9.359448683 podStartE2EDuration="9.359448683s" podCreationTimestamp="2026-04-22 18:46:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:48.525274439 +0000 UTC m=+5.587758771" watchObservedRunningTime="2026-04-22 18:46:53.359448683 +0000 UTC m=+10.421933016" Apr 22 18:46:53.360473 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:53.359712 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-bh2v7"] Apr 22 18:46:53.368462 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:53.368436 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:46:53.368617 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:53.368527 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bh2v7" podUID="d1699627-d338-4301-b677-d1b46965f511" Apr 22 18:46:53.452513 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:53.452437 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d1699627-d338-4301-b677-d1b46965f511-dbus\") pod \"global-pull-secret-syncer-bh2v7\" (UID: \"d1699627-d338-4301-b677-d1b46965f511\") " pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:46:53.452513 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:53.452489 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d1699627-d338-4301-b677-d1b46965f511-kubelet-config\") pod \"global-pull-secret-syncer-bh2v7\" (UID: \"d1699627-d338-4301-b677-d1b46965f511\") " pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:46:53.452719 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:53.452543 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d1699627-d338-4301-b677-d1b46965f511-original-pull-secret\") pod \"global-pull-secret-syncer-bh2v7\" (UID: \"d1699627-d338-4301-b677-d1b46965f511\") " pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:46:53.470113 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:53.470074 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:46:53.470304 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:53.470266 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zn7pv" podUID="16f54576-6941-4246-bcb0-89cfeef13253" Apr 22 18:46:53.471568 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:53.470083 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:46:53.471568 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:53.470812 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-84n9f" podUID="f5a171a3-924e-421e-a715-95ec17243358" Apr 22 18:46:53.552996 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:53.552958 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d1699627-d338-4301-b677-d1b46965f511-dbus\") pod \"global-pull-secret-syncer-bh2v7\" (UID: \"d1699627-d338-4301-b677-d1b46965f511\") " pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:46:53.553165 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:53.553012 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d1699627-d338-4301-b677-d1b46965f511-kubelet-config\") pod \"global-pull-secret-syncer-bh2v7\" (UID: \"d1699627-d338-4301-b677-d1b46965f511\") " pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:46:53.553165 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:53.553066 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d1699627-d338-4301-b677-d1b46965f511-original-pull-secret\") pod \"global-pull-secret-syncer-bh2v7\" (UID: \"d1699627-d338-4301-b677-d1b46965f511\") " pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:46:53.553319 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:53.553169 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/d1699627-d338-4301-b677-d1b46965f511-dbus\") pod \"global-pull-secret-syncer-bh2v7\" (UID: \"d1699627-d338-4301-b677-d1b46965f511\") " pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:46:53.553319 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:53.553169 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/d1699627-d338-4301-b677-d1b46965f511-kubelet-config\") pod \"global-pull-secret-syncer-bh2v7\" (UID: \"d1699627-d338-4301-b677-d1b46965f511\") " pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:46:53.553319 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:53.553305 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:53.553480 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:53.553372 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1699627-d338-4301-b677-d1b46965f511-original-pull-secret podName:d1699627-d338-4301-b677-d1b46965f511 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:54.053352959 +0000 UTC m=+11.115837283 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d1699627-d338-4301-b677-d1b46965f511-original-pull-secret") pod "global-pull-secret-syncer-bh2v7" (UID: "d1699627-d338-4301-b677-d1b46965f511") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:54.058243 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:54.058128 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d1699627-d338-4301-b677-d1b46965f511-original-pull-secret\") pod \"global-pull-secret-syncer-bh2v7\" (UID: \"d1699627-d338-4301-b677-d1b46965f511\") " pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:46:54.058448 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:54.058310 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:54.058448 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:54.058390 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1699627-d338-4301-b677-d1b46965f511-original-pull-secret podName:d1699627-d338-4301-b677-d1b46965f511 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:55.058368041 +0000 UTC m=+12.120852353 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d1699627-d338-4301-b677-d1b46965f511-original-pull-secret") pod "global-pull-secret-syncer-bh2v7" (UID: "d1699627-d338-4301-b677-d1b46965f511") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:55.066279 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:55.066182 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d1699627-d338-4301-b677-d1b46965f511-original-pull-secret\") pod \"global-pull-secret-syncer-bh2v7\" (UID: \"d1699627-d338-4301-b677-d1b46965f511\") " pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:46:55.066779 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:55.066376 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:55.066779 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:55.066457 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1699627-d338-4301-b677-d1b46965f511-original-pull-secret podName:d1699627-d338-4301-b677-d1b46965f511 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:57.066436859 +0000 UTC m=+14.128921183 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d1699627-d338-4301-b677-d1b46965f511-original-pull-secret") pod "global-pull-secret-syncer-bh2v7" (UID: "d1699627-d338-4301-b677-d1b46965f511") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:55.464975 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:55.464884 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:46:55.464975 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:55.464926 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:46:55.464975 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:55.464884 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:46:55.465280 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:55.465032 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zn7pv" podUID="16f54576-6941-4246-bcb0-89cfeef13253" Apr 22 18:46:55.465280 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:55.465102 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bh2v7" podUID="d1699627-d338-4301-b677-d1b46965f511" Apr 22 18:46:55.465280 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:55.465191 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84n9f" podUID="f5a171a3-924e-421e-a715-95ec17243358" Apr 22 18:46:57.079611 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:57.079573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d1699627-d338-4301-b677-d1b46965f511-original-pull-secret\") pod \"global-pull-secret-syncer-bh2v7\" (UID: \"d1699627-d338-4301-b677-d1b46965f511\") " pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:46:57.080008 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:57.079712 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:57.080008 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:57.079779 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1699627-d338-4301-b677-d1b46965f511-original-pull-secret podName:d1699627-d338-4301-b677-d1b46965f511 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:01.079759359 +0000 UTC m=+18.142243668 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d1699627-d338-4301-b677-d1b46965f511-original-pull-secret") pod "global-pull-secret-syncer-bh2v7" (UID: "d1699627-d338-4301-b677-d1b46965f511") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:57.465852 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:57.465763 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:46:57.465852 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:57.465793 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:46:57.465852 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:57.465768 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:46:57.466111 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:57.465888 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84n9f" podUID="f5a171a3-924e-421e-a715-95ec17243358" Apr 22 18:46:57.466111 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:57.465992 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zn7pv" podUID="16f54576-6941-4246-bcb0-89cfeef13253" Apr 22 18:46:57.466111 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:57.466095 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bh2v7" podUID="d1699627-d338-4301-b677-d1b46965f511" Apr 22 18:46:59.465702 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:59.465660 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:46:59.466174 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:59.465789 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:46:59.466174 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:46:59.465813 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:46:59.466174 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:59.465792 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84n9f" podUID="f5a171a3-924e-421e-a715-95ec17243358" Apr 22 18:46:59.466174 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:59.465937 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zn7pv" podUID="16f54576-6941-4246-bcb0-89cfeef13253" Apr 22 18:46:59.466174 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:46:59.466012 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bh2v7" podUID="d1699627-d338-4301-b677-d1b46965f511" Apr 22 18:47:01.110490 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:01.110455 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d1699627-d338-4301-b677-d1b46965f511-original-pull-secret\") pod \"global-pull-secret-syncer-bh2v7\" (UID: \"d1699627-d338-4301-b677-d1b46965f511\") " pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:47:01.110906 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:01.110580 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:47:01.110906 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:01.110647 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1699627-d338-4301-b677-d1b46965f511-original-pull-secret podName:d1699627-d338-4301-b677-d1b46965f511 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:09.110629039 +0000 UTC m=+26.173113356 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d1699627-d338-4301-b677-d1b46965f511-original-pull-secret") pod "global-pull-secret-syncer-bh2v7" (UID: "d1699627-d338-4301-b677-d1b46965f511") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:47:01.211659 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:01.211619 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs\") pod \"network-metrics-daemon-zn7pv\" (UID: \"16f54576-6941-4246-bcb0-89cfeef13253\") " pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:47:01.211827 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:01.211807 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:47:01.211901 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:01.211888 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs podName:16f54576-6941-4246-bcb0-89cfeef13253 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:17.211867196 +0000 UTC m=+34.274351520 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs") pod "network-metrics-daemon-zn7pv" (UID: "16f54576-6941-4246-bcb0-89cfeef13253") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:47:01.312332 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:01.312291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c44v9\" (UniqueName: \"kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9\") pod \"network-check-target-84n9f\" (UID: \"f5a171a3-924e-421e-a715-95ec17243358\") " pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:47:01.312516 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:01.312431 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:47:01.312516 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:01.312455 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:47:01.312516 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:01.312468 2575 projected.go:194] Error preparing data for projected volume kube-api-access-c44v9 for pod openshift-network-diagnostics/network-check-target-84n9f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:47:01.312660 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:01.312540 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9 podName:f5a171a3-924e-421e-a715-95ec17243358 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:17.312519559 +0000 UTC m=+34.375003892 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-c44v9" (UniqueName: "kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9") pod "network-check-target-84n9f" (UID: "f5a171a3-924e-421e-a715-95ec17243358") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:47:01.464793 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:01.464713 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:47:01.464947 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:01.464708 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:47:01.464947 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:01.464831 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bh2v7" podUID="d1699627-d338-4301-b677-d1b46965f511" Apr 22 18:47:01.464947 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:01.464710 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:47:01.464947 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:01.464893 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84n9f" podUID="f5a171a3-924e-421e-a715-95ec17243358" Apr 22 18:47:01.465133 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:01.464995 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zn7pv" podUID="16f54576-6941-4246-bcb0-89cfeef13253" Apr 22 18:47:03.465971 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:03.465912 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:47:03.466311 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:03.465994 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84n9f" podUID="f5a171a3-924e-421e-a715-95ec17243358" Apr 22 18:47:03.466311 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:03.466031 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:47:03.466311 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:03.466082 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zn7pv" podUID="16f54576-6941-4246-bcb0-89cfeef13253" Apr 22 18:47:03.466311 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:03.466110 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:47:03.466311 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:03.466156 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bh2v7" podUID="d1699627-d338-4301-b677-d1b46965f511" Apr 22 18:47:04.542598 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.542152 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rlzcp" event={"ID":"2db298c2-8f21-4074-b8fa-de93cd62c24a","Type":"ContainerStarted","Data":"a0802226b51b3d909c1a4545ffa38bfcf87e9a0fe30548d00e080ba0cda8a436"} Apr 22 18:47:04.543503 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.543480 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4w69n" event={"ID":"1a7ef120-7570-4923-8c42-9141c31054c0","Type":"ContainerStarted","Data":"1d5e7a78363dbff510da37d0bebd03eb9ddd926b1e07730627bce0852f63d37c"} Apr 22 18:47:04.544684 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.544665 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vgg98" event={"ID":"7d7e2f0a-fdb6-465e-81f4-dd71e4ceb2e1","Type":"ContainerStarted","Data":"5c10098efebdac833d5154ec1b3f357cc36d3117e7172d4fdfd38d9d28838055"} Apr 22 18:47:04.545922 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.545895 2575 generic.go:358] "Generic (PLEG): container finished" podID="46f110eb-3658-4771-b650-29f48ae2842a" containerID="986e58cf5ceb4d501f7df32cade4ebb13ea5314cb681ee0ea81c55bb574716df" exitCode=0 Apr 22 18:47:04.546029 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.545973 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vdhgw" event={"ID":"46f110eb-3658-4771-b650-29f48ae2842a","Type":"ContainerDied","Data":"986e58cf5ceb4d501f7df32cade4ebb13ea5314cb681ee0ea81c55bb574716df"} Apr 22 18:47:04.547347 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.547229 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" event={"ID":"82a7e0c0-777f-4215-8cb4-a68ede452c23","Type":"ContainerStarted","Data":"ec9dbc0fb59bb4a6b61cc48893046220ba62eb63cb1fc78fb76d82ac21283894"} Apr 22 18:47:04.548332 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.548314 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rqzlh" event={"ID":"ea1be31b-74a3-48cb-b181-97ff279b206a","Type":"ContainerStarted","Data":"a1d4503a23a9250e6e0fa8c3363de957c5037375fe68f1c9b1cff358aa3bf8c4"} Apr 22 18:47:04.549512 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.549487 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kfn2f" event={"ID":"af1a0441-5fdc-4aa3-ac3d-26b54d430c2b","Type":"ContainerStarted","Data":"a8aab507ac93aa9d8e068bb5509823fd9bf8cdc9262249acbdd47ea69b675708"} Apr 22 18:47:04.551557 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.551540 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 18:47:04.551855 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.551837 2575 generic.go:358] "Generic (PLEG): container finished" podID="e473ff71-80f3-4edf-8096-6fa108acae8a" containerID="944a5a037aef43d419041f9cbd9d45b6ff6d01b5af4704eedc502755f51c813f" exitCode=1 Apr 22 18:47:04.551909 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.551866 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" 
event={"ID":"e473ff71-80f3-4edf-8096-6fa108acae8a","Type":"ContainerStarted","Data":"fb278fbe4b41b3904d477837b35fb7890f7dbacb828393b4e3297f67650e2b0b"} Apr 22 18:47:04.551909 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.551879 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" event={"ID":"e473ff71-80f3-4edf-8096-6fa108acae8a","Type":"ContainerStarted","Data":"07246f4235fa490fefb3c51054acf9dd322e42d956193ec226d5c0f71656b653"} Apr 22 18:47:04.551909 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.551888 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" event={"ID":"e473ff71-80f3-4edf-8096-6fa108acae8a","Type":"ContainerStarted","Data":"90df1d8e20d761dab9fe13af09c9056afa5cd93ce2c5a96ac1ecc987327e3f0a"} Apr 22 18:47:04.551909 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.551897 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" event={"ID":"e473ff71-80f3-4edf-8096-6fa108acae8a","Type":"ContainerStarted","Data":"3723e72e189cd21b64c0575cc5f9a546232439434dabee79979511a5672d50cd"} Apr 22 18:47:04.551909 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.551908 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" event={"ID":"e473ff71-80f3-4edf-8096-6fa108acae8a","Type":"ContainerDied","Data":"944a5a037aef43d419041f9cbd9d45b6ff6d01b5af4704eedc502755f51c813f"} Apr 22 18:47:04.552049 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.551918 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" event={"ID":"e473ff71-80f3-4edf-8096-6fa108acae8a","Type":"ContainerStarted","Data":"cb8e5f845d82bde74aa58328ae76baaba6aa0e42ae9a78716c26540ed59754d1"} Apr 22 18:47:04.559483 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.559447 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rlzcp" podStartSLOduration=4.205170052 podStartE2EDuration="21.559435134s" podCreationTimestamp="2026-04-22 18:46:43 +0000 UTC" firstStartedPulling="2026-04-22 18:46:46.130162494 +0000 UTC m=+3.192646808" lastFinishedPulling="2026-04-22 18:47:03.484427577 +0000 UTC m=+20.546911890" observedRunningTime="2026-04-22 18:47:04.559412944 +0000 UTC m=+21.621897274" watchObservedRunningTime="2026-04-22 18:47:04.559435134 +0000 UTC m=+21.621919474" Apr 22 18:47:04.574597 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.574545 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-4w69n" podStartSLOduration=4.278534293 podStartE2EDuration="21.574530307s" podCreationTimestamp="2026-04-22 18:46:43 +0000 UTC" firstStartedPulling="2026-04-22 18:46:46.123507666 +0000 UTC m=+3.185991978" lastFinishedPulling="2026-04-22 18:47:03.419503684 +0000 UTC m=+20.481987992" observedRunningTime="2026-04-22 18:47:04.57429925 +0000 UTC m=+21.636783580" watchObservedRunningTime="2026-04-22 18:47:04.574530307 +0000 UTC m=+21.637014638" Apr 22 18:47:04.607902 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.607854 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vgg98" podStartSLOduration=4.311560462 podStartE2EDuration="21.60783744s" podCreationTimestamp="2026-04-22 18:46:43 +0000 UTC" firstStartedPulling="2026-04-22 18:46:46.123220281 +0000 UTC m=+3.185704605" lastFinishedPulling="2026-04-22 
18:47:03.41949726 +0000 UTC m=+20.481981583" observedRunningTime="2026-04-22 18:47:04.587834472 +0000 UTC m=+21.650318813" watchObservedRunningTime="2026-04-22 18:47:04.60783744 +0000 UTC m=+21.670321773" Apr 22 18:47:04.608140 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.608111 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-kfn2f" podStartSLOduration=4.281458321 podStartE2EDuration="21.608102054s" podCreationTimestamp="2026-04-22 18:46:43 +0000 UTC" firstStartedPulling="2026-04-22 18:46:46.131851981 +0000 UTC m=+3.194336303" lastFinishedPulling="2026-04-22 18:47:03.458495726 +0000 UTC m=+20.520980036" observedRunningTime="2026-04-22 18:47:04.607779324 +0000 UTC m=+21.670263655" watchObservedRunningTime="2026-04-22 18:47:04.608102054 +0000 UTC m=+21.670586385" Apr 22 18:47:04.623440 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.623389 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rqzlh" podStartSLOduration=4.330196916 podStartE2EDuration="21.623374195s" podCreationTimestamp="2026-04-22 18:46:43 +0000 UTC" firstStartedPulling="2026-04-22 18:46:46.126320679 +0000 UTC m=+3.188804993" lastFinishedPulling="2026-04-22 18:47:03.419497961 +0000 UTC m=+20.481982272" observedRunningTime="2026-04-22 18:47:04.62334837 +0000 UTC m=+21.685832700" watchObservedRunningTime="2026-04-22 18:47:04.623374195 +0000 UTC m=+21.685858508" Apr 22 18:47:04.782282 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:04.782250 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:47:05.438419 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:05.438291 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:47:04.78227465Z","UUID":"051218f6-ad2b-420e-bcb2-3d98f0a3c778","Handler":null,"Name":"","Endpoint":""} Apr 22 18:47:05.440369 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:05.440331 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:47:05.440369 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:05.440363 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:47:05.465562 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:05.465161 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:47:05.465562 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:05.465311 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zn7pv" podUID="16f54576-6941-4246-bcb0-89cfeef13253" Apr 22 18:47:05.465807 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:05.465728 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:47:05.465869 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:05.465824 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bh2v7" podUID="d1699627-d338-4301-b677-d1b46965f511" Apr 22 18:47:05.465945 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:05.465929 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:47:05.466017 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:05.465998 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84n9f" podUID="f5a171a3-924e-421e-a715-95ec17243358" Apr 22 18:47:05.555998 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:05.555954 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" event={"ID":"82a7e0c0-777f-4215-8cb4-a68ede452c23","Type":"ContainerStarted","Data":"e8e4c421ec48437d0ade23ef294730d820a01ca37b45b133d6a23e23d6b69f51"} Apr 22 18:47:05.557618 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:05.557572 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rqxnk" event={"ID":"ec7921c7-6ea3-40af-a74d-07b404ff9eb9","Type":"ContainerStarted","Data":"aaa2724228f164cfcd359b0ee20eba74cba1ab3410943c9d0c348e30c213a2f5"} Apr 22 18:47:05.573013 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:05.572956 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rqxnk" podStartSLOduration=5.2331008820000005 podStartE2EDuration="22.572936864s" podCreationTimestamp="2026-04-22 18:46:43 +0000 UTC" firstStartedPulling="2026-04-22 18:46:46.121398105 +0000 UTC m=+3.183882417" lastFinishedPulling="2026-04-22 18:47:03.461234083 +0000 UTC m=+20.523718399" observedRunningTime="2026-04-22 18:47:05.572908682 +0000 UTC m=+22.635393025" watchObservedRunningTime="2026-04-22 18:47:05.572936864 +0000 UTC m=+22.635421218" Apr 22 18:47:06.563916 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:06.563686 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 18:47:06.564365 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:06.564279 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" event={"ID":"e473ff71-80f3-4edf-8096-6fa108acae8a","Type":"ContainerStarted","Data":"9bc2295be845a9f3cd77f088582bd3115c9898876afb33b301668117fb566c24"} Apr 22 18:47:06.566591 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:06.566561 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" 
event={"ID":"82a7e0c0-777f-4215-8cb4-a68ede452c23","Type":"ContainerStarted","Data":"206859806b3e09e2c1f94231ea747e98ec2fc5b3aa52486679a3bd801f624b7e"} Apr 22 18:47:06.585187 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:06.585128 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mv449" podStartSLOduration=3.900549558 podStartE2EDuration="23.585107947s" podCreationTimestamp="2026-04-22 18:46:43 +0000 UTC" firstStartedPulling="2026-04-22 18:46:46.128802852 +0000 UTC m=+3.191287167" lastFinishedPulling="2026-04-22 18:47:05.813361232 +0000 UTC m=+22.875845556" observedRunningTime="2026-04-22 18:47:06.584957122 +0000 UTC m=+23.647441467" watchObservedRunningTime="2026-04-22 18:47:06.585107947 +0000 UTC m=+23.647592280" Apr 22 18:47:07.464912 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:07.464873 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:47:07.465124 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:07.464949 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:47:07.465124 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:07.464977 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:47:07.465258 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:07.465107 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zn7pv" podUID="16f54576-6941-4246-bcb0-89cfeef13253" Apr 22 18:47:07.465258 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:07.465128 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84n9f" podUID="f5a171a3-924e-421e-a715-95ec17243358" Apr 22 18:47:07.465344 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:07.465249 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bh2v7" podUID="d1699627-d338-4301-b677-d1b46965f511" Apr 22 18:47:09.100100 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:09.099910 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vgg98" Apr 22 18:47:09.174949 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:09.174913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d1699627-d338-4301-b677-d1b46965f511-original-pull-secret\") pod \"global-pull-secret-syncer-bh2v7\" (UID: \"d1699627-d338-4301-b677-d1b46965f511\") " pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:47:09.175105 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:09.175033 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:47:09.175105 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:09.175092 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1699627-d338-4301-b677-d1b46965f511-original-pull-secret podName:d1699627-d338-4301-b677-d1b46965f511 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:25.17507612 +0000 UTC m=+42.237560429 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/d1699627-d338-4301-b677-d1b46965f511-original-pull-secret") pod "global-pull-secret-syncer-bh2v7" (UID: "d1699627-d338-4301-b677-d1b46965f511") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:47:09.404067 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:09.403982 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vgg98" Apr 22 18:47:09.404635 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:09.404618 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vgg98" Apr 22 18:47:09.465166 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:09.465130 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:47:09.465342 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:09.465135 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:47:09.465342 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:09.465268 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bh2v7" podUID="d1699627-d338-4301-b677-d1b46965f511" Apr 22 18:47:09.465342 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:09.465135 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:47:09.465342 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:09.465321 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zn7pv" podUID="16f54576-6941-4246-bcb0-89cfeef13253" Apr 22 18:47:09.465523 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:09.465369 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84n9f" podUID="f5a171a3-924e-421e-a715-95ec17243358" Apr 22 18:47:09.575783 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:09.575755 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 18:47:09.576115 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:09.576091 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" event={"ID":"e473ff71-80f3-4edf-8096-6fa108acae8a","Type":"ContainerStarted","Data":"892a7063757ad953c57aaa292bf97fa5352f8886336bb0bd083d26d59f395343"} Apr 22 18:47:09.576452 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:09.576428 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:47:09.576585 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:09.576570 2575 scope.go:117] "RemoveContainer" containerID="944a5a037aef43d419041f9cbd9d45b6ff6d01b5af4704eedc502755f51c813f" Apr 22 18:47:09.577744 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:09.577721 2575 generic.go:358] "Generic (PLEG): container finished" podID="46f110eb-3658-4771-b650-29f48ae2842a" containerID="e495801ec6d67eac608804b9ee19732af1a35748cac84db5f1d37dfc4a05857e" exitCode=0 Apr 22 18:47:09.577834 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:09.577811 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vdhgw" event={"ID":"46f110eb-3658-4771-b650-29f48ae2842a","Type":"ContainerDied","Data":"e495801ec6d67eac608804b9ee19732af1a35748cac84db5f1d37dfc4a05857e"} Apr 22 18:47:09.578479 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:09.578454 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vgg98" Apr 22 18:47:09.591832 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:09.591810 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:47:10.514382 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:10.514348 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bh2v7"] Apr 22 18:47:10.514750 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:10.514468 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:47:10.514750 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:10.514543 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bh2v7" podUID="d1699627-d338-4301-b677-d1b46965f511" Apr 22 18:47:10.517147 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:10.517120 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zn7pv"] Apr 22 18:47:10.517276 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:10.517261 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:47:10.517404 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:10.517384 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zn7pv" podUID="16f54576-6941-4246-bcb0-89cfeef13253" Apr 22 18:47:10.517663 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:10.517643 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-84n9f"] Apr 22 18:47:10.517762 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:10.517750 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:47:10.517924 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:10.517826 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84n9f" podUID="f5a171a3-924e-421e-a715-95ec17243358" Apr 22 18:47:10.583291 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:10.583194 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 18:47:10.583637 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:10.583610 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" event={"ID":"e473ff71-80f3-4edf-8096-6fa108acae8a","Type":"ContainerStarted","Data":"a0f525eb4610736437622bfce5a6cf2f1d921f071c2a5b19627745ffd969c150"} Apr 22 18:47:10.583722 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:10.583707 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:47:10.583909 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:10.583886 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:47:10.600616 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:10.600586 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:47:10.614538 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:10.614490 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" podStartSLOduration=10.219543939 podStartE2EDuration="27.614477006s" podCreationTimestamp="2026-04-22 18:46:43 +0000 UTC" firstStartedPulling="2026-04-22 18:46:46.131504693 +0000 UTC m=+3.193989017" lastFinishedPulling="2026-04-22 18:47:03.526437766 +0000 UTC m=+20.588922084" 
observedRunningTime="2026-04-22 18:47:10.613000101 +0000 UTC m=+27.675484461" watchObservedRunningTime="2026-04-22 18:47:10.614477006 +0000 UTC m=+27.676961337" Apr 22 18:47:11.585597 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:11.585546 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:47:12.083973 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:12.083773 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:47:12.465468 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:12.465433 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:47:12.465468 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:12.465466 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:47:12.465468 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:12.465478 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:47:12.465703 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:12.465548 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84n9f" podUID="f5a171a3-924e-421e-a715-95ec17243358" Apr 22 18:47:12.465703 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:12.465605 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zn7pv" podUID="16f54576-6941-4246-bcb0-89cfeef13253" Apr 22 18:47:12.465703 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:12.465671 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bh2v7" podUID="d1699627-d338-4301-b677-d1b46965f511" Apr 22 18:47:12.589679 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:12.589646 2575 generic.go:358] "Generic (PLEG): container finished" podID="46f110eb-3658-4771-b650-29f48ae2842a" containerID="998b2964fbbcd9a5a501e679a0c4cbec4cf164c84903100db31ef40926cb785e" exitCode=0 Apr 22 18:47:12.590127 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:12.589740 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vdhgw" event={"ID":"46f110eb-3658-4771-b650-29f48ae2842a","Type":"ContainerDied","Data":"998b2964fbbcd9a5a501e679a0c4cbec4cf164c84903100db31ef40926cb785e"} Apr 22 18:47:13.593768 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:13.593675 2575 generic.go:358] "Generic (PLEG): container finished" podID="46f110eb-3658-4771-b650-29f48ae2842a" containerID="b8f7f242eca40b0b19313804265115759343b05ae65d282c73e868d29b3a425b" exitCode=0 Apr 22 18:47:13.594457 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:13.593761 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vdhgw" event={"ID":"46f110eb-3658-4771-b650-29f48ae2842a","Type":"ContainerDied","Data":"b8f7f242eca40b0b19313804265115759343b05ae65d282c73e868d29b3a425b"} Apr 22 18:47:13.606784 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:13.606735 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" podUID="e473ff71-80f3-4edf-8096-6fa108acae8a" containerName="ovnkube-controller" probeResult="failure" output="" Apr 22 18:47:14.465749 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:14.465712 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:47:14.465897 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:14.465712 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:47:14.465897 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:14.465712 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:47:14.465984 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:14.465900 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84n9f" podUID="f5a171a3-924e-421e-a715-95ec17243358" Apr 22 18:47:14.465984 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:14.465971 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zn7pv" podUID="16f54576-6941-4246-bcb0-89cfeef13253" Apr 22 18:47:14.466041 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:14.465819 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bh2v7" podUID="d1699627-d338-4301-b677-d1b46965f511" Apr 22 18:47:16.465801 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.465758 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:47:16.466407 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.465759 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:47:16.466407 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:16.465889 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zn7pv" podUID="16f54576-6941-4246-bcb0-89cfeef13253" Apr 22 18:47:16.466407 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.465759 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:47:16.466407 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:16.465981 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-84n9f" podUID="f5a171a3-924e-421e-a715-95ec17243358" Apr 22 18:47:16.466407 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:16.466055 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bh2v7" podUID="d1699627-d338-4301-b677-d1b46965f511" Apr 22 18:47:16.785353 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.785323 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-109.ec2.internal" event="NodeReady" Apr 22 18:47:16.785535 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.785472 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:47:16.823923 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.823891 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5xxjt"] Apr 22 18:47:16.827052 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.827023 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-lv9p4"] Apr 22 18:47:16.827197 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.827158 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-5xxjt" Apr 22 18:47:16.830182 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.830083 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:47:16.830182 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.830085 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ctxn4\"" Apr 22 18:47:16.830380 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.830247 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:47:16.830380 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.830296 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lv9p4" Apr 22 18:47:16.833199 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.832952 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:47:16.833199 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.832972 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tdfh7\"" Apr 22 18:47:16.833199 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.833008 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:47:16.833199 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.833009 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:47:16.841010 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.840964 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5xxjt"] Apr 22 18:47:16.841808 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.841769 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lv9p4"] Apr 22 18:47:16.931539 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.931499 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnvmd\" (UniqueName: \"kubernetes.io/projected/a39d7311-24fb-45df-884c-194983c0905a-kube-api-access-gnvmd\") pod \"ingress-canary-lv9p4\" (UID: \"a39d7311-24fb-45df-884c-194983c0905a\") " pod="openshift-ingress-canary/ingress-canary-lv9p4" Apr 22 18:47:16.931539 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.931548 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-tmp-dir\") pod \"dns-default-5xxjt\" (UID: \"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88\") " pod="openshift-dns/dns-default-5xxjt" Apr 22 18:47:16.931781 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.931658 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-config-volume\") pod \"dns-default-5xxjt\" (UID: \"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88\") " pod="openshift-dns/dns-default-5xxjt" Apr 22 18:47:16.931781 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.931691 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rv8jw\" (UniqueName: \"kubernetes.io/projected/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-kube-api-access-rv8jw\") pod \"dns-default-5xxjt\" (UID: \"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88\") " pod="openshift-dns/dns-default-5xxjt" Apr 22 18:47:16.931781 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.931713 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert\") pod \"ingress-canary-lv9p4\" (UID: \"a39d7311-24fb-45df-884c-194983c0905a\") " pod="openshift-ingress-canary/ingress-canary-lv9p4" Apr 22 18:47:16.931781 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:16.931772 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls\") pod \"dns-default-5xxjt\" (UID: \"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88\") " pod="openshift-dns/dns-default-5xxjt" Apr 22 18:47:17.032904 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:17.032869 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls\") pod \"dns-default-5xxjt\" (UID: \"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88\") " pod="openshift-dns/dns-default-5xxjt" Apr 22 18:47:17.033101 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:17.032935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnvmd\" (UniqueName: \"kubernetes.io/projected/a39d7311-24fb-45df-884c-194983c0905a-kube-api-access-gnvmd\") pod \"ingress-canary-lv9p4\" (UID: \"a39d7311-24fb-45df-884c-194983c0905a\") " pod="openshift-ingress-canary/ingress-canary-lv9p4" Apr 22 18:47:17.033101 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:17.033031 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:17.033227 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:17.033108 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls podName:a708e3d6-d406-4cfd-ab5f-8dd221a9fd88 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:17.533085249 +0000 UTC m=+34.595569582 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls") pod "dns-default-5xxjt" (UID: "a708e3d6-d406-4cfd-ab5f-8dd221a9fd88") : secret "dns-default-metrics-tls" not found Apr 22 18:47:17.033227 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:17.033133 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-tmp-dir\") pod \"dns-default-5xxjt\" (UID: \"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88\") " pod="openshift-dns/dns-default-5xxjt" Apr 22 18:47:17.033349 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:17.033225 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-config-volume\") pod \"dns-default-5xxjt\" (UID: \"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88\") " pod="openshift-dns/dns-default-5xxjt" Apr 22 18:47:17.033349 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:17.033251 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rv8jw\" (UniqueName: \"kubernetes.io/projected/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-kube-api-access-rv8jw\") pod \"dns-default-5xxjt\" (UID: \"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88\") " pod="openshift-dns/dns-default-5xxjt" Apr 22 18:47:17.033349 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:17.033280 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert\") pod \"ingress-canary-lv9p4\" (UID: \"a39d7311-24fb-45df-884c-194983c0905a\") " pod="openshift-ingress-canary/ingress-canary-lv9p4" Apr 22 18:47:17.033498 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:17.033374 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:17.033498 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:17.033417 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert podName:a39d7311-24fb-45df-884c-194983c0905a nodeName:}" failed. No retries permitted until 2026-04-22 18:47:17.533402246 +0000 UTC m=+34.595886557 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert") pod "ingress-canary-lv9p4" (UID: "a39d7311-24fb-45df-884c-194983c0905a") : secret "canary-serving-cert" not found Apr 22 18:47:17.033605 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:17.033564 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-tmp-dir\") pod \"dns-default-5xxjt\" (UID: \"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88\") " pod="openshift-dns/dns-default-5xxjt" Apr 22 18:47:17.033866 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:17.033847 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-config-volume\") pod \"dns-default-5xxjt\" (UID: \"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88\") " pod="openshift-dns/dns-default-5xxjt" Apr 22 18:47:17.044298 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:17.044235 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv8jw\" (UniqueName: \"kubernetes.io/projected/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-kube-api-access-rv8jw\") pod \"dns-default-5xxjt\" (UID: \"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88\") " pod="openshift-dns/dns-default-5xxjt" Apr 22 18:47:17.044441 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:17.044324 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnvmd\" (UniqueName: \"kubernetes.io/projected/a39d7311-24fb-45df-884c-194983c0905a-kube-api-access-gnvmd\") pod \"ingress-canary-lv9p4\" (UID: \"a39d7311-24fb-45df-884c-194983c0905a\") " pod="openshift-ingress-canary/ingress-canary-lv9p4" Apr 22 18:47:17.235078 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:17.235037 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs\") pod \"network-metrics-daemon-zn7pv\" (UID: \"16f54576-6941-4246-bcb0-89cfeef13253\") " pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:47:17.235304 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:17.235227 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:47:17.235304 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:17.235300 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs podName:16f54576-6941-4246-bcb0-89cfeef13253 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:49.235285518 +0000 UTC m=+66.297769827 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs") pod "network-metrics-daemon-zn7pv" (UID: "16f54576-6941-4246-bcb0-89cfeef13253") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:47:17.336434 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:17.336338 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c44v9\" (UniqueName: \"kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9\") pod \"network-check-target-84n9f\" (UID: \"f5a171a3-924e-421e-a715-95ec17243358\") " pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:47:17.336605 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:17.336522 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:47:17.336605 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:17.336552 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:47:17.336605 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:17.336566 2575 projected.go:194] Error preparing data for projected volume kube-api-access-c44v9 for pod openshift-network-diagnostics/network-check-target-84n9f: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:47:17.336760 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:17.336635 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9 podName:f5a171a3-924e-421e-a715-95ec17243358 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:49.336615358 +0000 UTC m=+66.399099687 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-c44v9" (UniqueName: "kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9") pod "network-check-target-84n9f" (UID: "f5a171a3-924e-421e-a715-95ec17243358") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:47:17.537727 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:17.537685 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert\") pod \"ingress-canary-lv9p4\" (UID: \"a39d7311-24fb-45df-884c-194983c0905a\") " pod="openshift-ingress-canary/ingress-canary-lv9p4" Apr 22 18:47:17.538237 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:17.537740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls\") pod \"dns-default-5xxjt\" (UID: \"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88\") " pod="openshift-dns/dns-default-5xxjt" Apr 22 18:47:17.538237 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:17.537848 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:17.538237 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:17.537850 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:17.538237 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:17.537918 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls podName:a708e3d6-d406-4cfd-ab5f-8dd221a9fd88 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:18.537903569 +0000 UTC m=+35.600387877 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls") pod "dns-default-5xxjt" (UID: "a708e3d6-d406-4cfd-ab5f-8dd221a9fd88") : secret "dns-default-metrics-tls" not found Apr 22 18:47:17.538237 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:17.537933 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert podName:a39d7311-24fb-45df-884c-194983c0905a nodeName:}" failed. No retries permitted until 2026-04-22 18:47:18.53792698 +0000 UTC m=+35.600411290 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert") pod "ingress-canary-lv9p4" (UID: "a39d7311-24fb-45df-884c-194983c0905a") : secret "canary-serving-cert" not found Apr 22 18:47:18.464963 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:18.464921 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:47:18.465164 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:18.464921 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:47:18.465164 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:18.464921 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:47:18.469523 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:18.469486 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:47:18.469683 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:18.469533 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:47:18.469683 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:18.469669 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9rzz8\"" Apr 22 18:47:18.469800 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:18.469486 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:47:18.469928 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:18.469910 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:47:18.470068 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:18.469967 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-vhrzn\"" Apr 22 18:47:18.546938 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:18.546903 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert\") pod \"ingress-canary-lv9p4\" (UID: \"a39d7311-24fb-45df-884c-194983c0905a\") " pod="openshift-ingress-canary/ingress-canary-lv9p4" Apr 22 18:47:18.547412 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:18.546966 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls\") pod \"dns-default-5xxjt\" (UID: \"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88\") " pod="openshift-dns/dns-default-5xxjt" Apr 22 18:47:18.547412 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:18.547069 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:18.547412 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:18.547122 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:18.547412 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:18.547148 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert podName:a39d7311-24fb-45df-884c-194983c0905a nodeName:}" failed. No retries permitted until 2026-04-22 18:47:20.547128173 +0000 UTC m=+37.609612486 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert") pod "ingress-canary-lv9p4" (UID: "a39d7311-24fb-45df-884c-194983c0905a") : secret "canary-serving-cert" not found Apr 22 18:47:18.547412 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:18.547170 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls podName:a708e3d6-d406-4cfd-ab5f-8dd221a9fd88 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:47:20.547157287 +0000 UTC m=+37.609641600 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls") pod "dns-default-5xxjt" (UID: "a708e3d6-d406-4cfd-ab5f-8dd221a9fd88") : secret "dns-default-metrics-tls" not found Apr 22 18:47:19.608148 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:19.607965 2575 generic.go:358] "Generic (PLEG): container finished" podID="46f110eb-3658-4771-b650-29f48ae2842a" containerID="a92fa598197fcb9863e84b2bea7c3409fc7121b4acbb2030b73bb93413f8c233" exitCode=0 Apr 22 18:47:19.608576 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:19.608041 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vdhgw" event={"ID":"46f110eb-3658-4771-b650-29f48ae2842a","Type":"ContainerDied","Data":"a92fa598197fcb9863e84b2bea7c3409fc7121b4acbb2030b73bb93413f8c233"} Apr 22 18:47:20.564261 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:20.564224 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert\") pod \"ingress-canary-lv9p4\" (UID: \"a39d7311-24fb-45df-884c-194983c0905a\") " pod="openshift-ingress-canary/ingress-canary-lv9p4" Apr 22 18:47:20.564426 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:20.564273 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls\") pod \"dns-default-5xxjt\" (UID: \"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88\") " pod="openshift-dns/dns-default-5xxjt" Apr 22 18:47:20.564426 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:20.564370 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:20.564426 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:20.564373 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:20.564426 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:20.564423 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls podName:a708e3d6-d406-4cfd-ab5f-8dd221a9fd88 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:24.564410015 +0000 UTC m=+41.626894327 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls") pod "dns-default-5xxjt" (UID: "a708e3d6-d406-4cfd-ab5f-8dd221a9fd88") : secret "dns-default-metrics-tls" not found Apr 22 18:47:20.564597 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:20.564436 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert podName:a39d7311-24fb-45df-884c-194983c0905a nodeName:}" failed. No retries permitted until 2026-04-22 18:47:24.564430331 +0000 UTC m=+41.626914640 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert") pod "ingress-canary-lv9p4" (UID: "a39d7311-24fb-45df-884c-194983c0905a") : secret "canary-serving-cert" not found Apr 22 18:47:20.612773 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:20.612740 2575 generic.go:358] "Generic (PLEG): container finished" podID="46f110eb-3658-4771-b650-29f48ae2842a" containerID="a3991de30595a12bf13c02bf78b3491c687704b147d8298c4c51bdec89ccd288" exitCode=0 Apr 22 18:47:20.613130 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:20.612804 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vdhgw" event={"ID":"46f110eb-3658-4771-b650-29f48ae2842a","Type":"ContainerDied","Data":"a3991de30595a12bf13c02bf78b3491c687704b147d8298c4c51bdec89ccd288"} Apr 22 18:47:21.617915 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:21.617875 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vdhgw" event={"ID":"46f110eb-3658-4771-b650-29f48ae2842a","Type":"ContainerStarted","Data":"02b4cf1026bf80de077977df33238696444c033c64432607ee15b096fe9ac1c7"} Apr 22 18:47:21.644660 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:21.644608 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vdhgw" podStartSLOduration=5.461258688 podStartE2EDuration="38.644590974s" podCreationTimestamp="2026-04-22 18:46:43 +0000 UTC" firstStartedPulling="2026-04-22 18:46:46.127505484 +0000 UTC m=+3.189989813" lastFinishedPulling="2026-04-22 18:47:19.310837778 +0000 UTC m=+36.373322099" observedRunningTime="2026-04-22 18:47:21.644339138 +0000 UTC m=+38.706823468" watchObservedRunningTime="2026-04-22 18:47:21.644590974 +0000 UTC m=+38.707075305" Apr 22 18:47:24.593781 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:24.593741 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls\") pod \"dns-default-5xxjt\" (UID: \"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88\") " pod="openshift-dns/dns-default-5xxjt" Apr 22 18:47:24.594203 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:24.593826 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert\") pod \"ingress-canary-lv9p4\" (UID: \"a39d7311-24fb-45df-884c-194983c0905a\") " pod="openshift-ingress-canary/ingress-canary-lv9p4" Apr 22 18:47:24.594203 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:24.593881 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:24.594203 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:24.593904 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:24.594203 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:24.593950 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert podName:a39d7311-24fb-45df-884c-194983c0905a nodeName:}" failed. No retries permitted until 2026-04-22 18:47:32.593936397 +0000 UTC m=+49.656420707 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert") pod "ingress-canary-lv9p4" (UID: "a39d7311-24fb-45df-884c-194983c0905a") : secret "canary-serving-cert" not found Apr 22 18:47:24.594203 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:24.593962 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls podName:a708e3d6-d406-4cfd-ab5f-8dd221a9fd88 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:32.593956468 +0000 UTC m=+49.656440777 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls") pod "dns-default-5xxjt" (UID: "a708e3d6-d406-4cfd-ab5f-8dd221a9fd88") : secret "dns-default-metrics-tls" not found Apr 22 18:47:25.199474 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:25.199434 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d1699627-d338-4301-b677-d1b46965f511-original-pull-secret\") pod \"global-pull-secret-syncer-bh2v7\" (UID: \"d1699627-d338-4301-b677-d1b46965f511\") " pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:47:25.202887 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:25.202860 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/d1699627-d338-4301-b677-d1b46965f511-original-pull-secret\") pod \"global-pull-secret-syncer-bh2v7\" (UID: \"d1699627-d338-4301-b677-d1b46965f511\") " pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:47:25.378616 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:25.378567 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bh2v7" Apr 22 18:47:25.534898 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:25.534855 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bh2v7"] Apr 22 18:47:25.538190 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:47:25.538161 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1699627_d338_4301_b677_d1b46965f511.slice/crio-62cf1f7b3013daa07527a9039246a7550363dc68e9d9efdb7a4753322dcd3c21 WatchSource:0}: Error finding container 62cf1f7b3013daa07527a9039246a7550363dc68e9d9efdb7a4753322dcd3c21: Status 404 returned error can't find the container with id 62cf1f7b3013daa07527a9039246a7550363dc68e9d9efdb7a4753322dcd3c21 Apr 22 18:47:25.628866 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:25.628829 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bh2v7" event={"ID":"d1699627-d338-4301-b677-d1b46965f511","Type":"ContainerStarted","Data":"62cf1f7b3013daa07527a9039246a7550363dc68e9d9efdb7a4753322dcd3c21"} Apr 22 18:47:29.638275 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:29.638241 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bh2v7" event={"ID":"d1699627-d338-4301-b677-d1b46965f511","Type":"ContainerStarted","Data":"b6d7d25ba82107ef5205483ce4efcf72e072e31956c3922912657967aef87614"} Apr 22 18:47:30.654199 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:30.654150 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-bh2v7" podStartSLOduration=33.740407502 podStartE2EDuration="37.654136115s" podCreationTimestamp="2026-04-22 18:46:53 +0000 UTC" firstStartedPulling="2026-04-22 18:47:25.539901064 +0000 UTC m=+42.602385387" lastFinishedPulling="2026-04-22 18:47:29.453629688 +0000 UTC m=+46.516114000" observedRunningTime="2026-04-22 18:47:30.653469461 +0000 UTC m=+47.715953802" watchObservedRunningTime="2026-04-22 18:47:30.654136115 +0000 UTC m=+47.716620445" Apr 22 18:47:32.653050 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:32.653008 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert\") pod \"ingress-canary-lv9p4\" (UID: \"a39d7311-24fb-45df-884c-194983c0905a\") " pod="openshift-ingress-canary/ingress-canary-lv9p4" Apr 22 18:47:32.653050 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:32.653058 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls\") pod \"dns-default-5xxjt\" (UID: \"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88\") " pod="openshift-dns/dns-default-5xxjt" Apr 22 18:47:32.653580 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:32.653149 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:32.653580 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:32.653172 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:32.653580 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:32.653228 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls 
podName:a708e3d6-d406-4cfd-ab5f-8dd221a9fd88 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:48.653194692 +0000 UTC m=+65.715679001 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls") pod "dns-default-5xxjt" (UID: "a708e3d6-d406-4cfd-ab5f-8dd221a9fd88") : secret "dns-default-metrics-tls" not found Apr 22 18:47:32.653580 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:32.653267 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert podName:a39d7311-24fb-45df-884c-194983c0905a nodeName:}" failed. No retries permitted until 2026-04-22 18:47:48.653246701 +0000 UTC m=+65.715731021 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert") pod "ingress-canary-lv9p4" (UID: "a39d7311-24fb-45df-884c-194983c0905a") : secret "canary-serving-cert" not found Apr 22 18:47:43.603758 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:43.603728 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p7bgj" Apr 22 18:47:48.668021 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:48.667978 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert\") pod \"ingress-canary-lv9p4\" (UID: \"a39d7311-24fb-45df-884c-194983c0905a\") " pod="openshift-ingress-canary/ingress-canary-lv9p4" Apr 22 18:47:48.668021 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:48.668031 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls\") pod \"dns-default-5xxjt\" (UID: \"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88\") " pod="openshift-dns/dns-default-5xxjt" Apr 22 18:47:48.668565 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:48.668126 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:48.668565 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:48.668130 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:48.668565 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:48.668186 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls podName:a708e3d6-d406-4cfd-ab5f-8dd221a9fd88 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:20.668168984 +0000 UTC m=+97.730653292 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls") pod "dns-default-5xxjt" (UID: "a708e3d6-d406-4cfd-ab5f-8dd221a9fd88") : secret "dns-default-metrics-tls" not found Apr 22 18:47:48.668565 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:48.668201 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert podName:a39d7311-24fb-45df-884c-194983c0905a nodeName:}" failed. No retries permitted until 2026-04-22 18:48:20.668195165 +0000 UTC m=+97.730679474 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert") pod "ingress-canary-lv9p4" (UID: "a39d7311-24fb-45df-884c-194983c0905a") : secret "canary-serving-cert" not found Apr 22 18:47:49.273055 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:49.273013 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs\") pod \"network-metrics-daemon-zn7pv\" (UID: \"16f54576-6941-4246-bcb0-89cfeef13253\") " pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:47:49.275770 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:49.275740 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:47:49.284014 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:49.283990 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:47:49.284129 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:47:49.284055 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs podName:16f54576-6941-4246-bcb0-89cfeef13253 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:53.284037113 +0000 UTC m=+130.346521423 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs") pod "network-metrics-daemon-zn7pv" (UID: "16f54576-6941-4246-bcb0-89cfeef13253") : secret "metrics-daemon-secret" not found Apr 22 18:47:49.373514 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:49.373480 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c44v9\" (UniqueName: \"kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9\") pod \"network-check-target-84n9f\" (UID: \"f5a171a3-924e-421e-a715-95ec17243358\") " pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:47:49.376414 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:49.376397 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:47:49.387537 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:49.387516 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:47:49.415047 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:49.415019 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c44v9\" (UniqueName: \"kubernetes.io/projected/f5a171a3-924e-421e-a715-95ec17243358-kube-api-access-c44v9\") pod \"network-check-target-84n9f\" (UID: \"f5a171a3-924e-421e-a715-95ec17243358\") " pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:47:49.702233 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:49.702135 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-vhrzn\"" Apr 22 18:47:49.709336 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:49.709314 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:47:49.838674 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:49.838565 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-84n9f"] Apr 22 18:47:49.840051 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:47:49.840004 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5a171a3_924e_421e_a715_95ec17243358.slice/crio-288d3f61059329c1cc170bf41328c6dd071d39a0e80ea9b3bd8a90cb54300d9f WatchSource:0}: Error finding container 288d3f61059329c1cc170bf41328c6dd071d39a0e80ea9b3bd8a90cb54300d9f: Status 404 returned error can't find the container with id 288d3f61059329c1cc170bf41328c6dd071d39a0e80ea9b3bd8a90cb54300d9f Apr 22 18:47:50.678770 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:50.678708 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-84n9f" event={"ID":"f5a171a3-924e-421e-a715-95ec17243358","Type":"ContainerStarted","Data":"288d3f61059329c1cc170bf41328c6dd071d39a0e80ea9b3bd8a90cb54300d9f"} Apr 22 18:47:52.684605 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:52.684570 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-84n9f" event={"ID":"f5a171a3-924e-421e-a715-95ec17243358","Type":"ContainerStarted","Data":"53cf742c9a6da56fdec9793734f34f10bce7bb61d76c783a75a696556fc03839"} Apr 22 18:47:52.684955 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:52.684808 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:47:52.699075 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:47:52.699025 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-84n9f" podStartSLOduration=66.953327457 podStartE2EDuration="1m9.699010279s" podCreationTimestamp="2026-04-22 18:46:43 +0000 UTC" firstStartedPulling="2026-04-22 18:47:49.841944999 +0000 UTC m=+66.904429308" lastFinishedPulling="2026-04-22 18:47:52.587627818 +0000 UTC m=+69.650112130" observedRunningTime="2026-04-22 18:47:52.69811097 +0000 UTC m=+69.760595302" watchObservedRunningTime="2026-04-22 18:47:52.699010279 +0000 UTC m=+69.761494588" Apr 22 18:48:20.693575 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:20.693526 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert\") pod \"ingress-canary-lv9p4\" (UID: \"a39d7311-24fb-45df-884c-194983c0905a\") " pod="openshift-ingress-canary/ingress-canary-lv9p4" Apr 22 18:48:20.693575 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:20.693579 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls\") pod \"dns-default-5xxjt\" (UID: \"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88\") " pod="openshift-dns/dns-default-5xxjt" Apr 22 18:48:20.694050 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:48:20.693672 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:48:20.694050 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:48:20.693676 2575 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:48:20.694050 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:48:20.693734 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls podName:a708e3d6-d406-4cfd-ab5f-8dd221a9fd88 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:24.693720016 +0000 UTC m=+161.756204325 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls") pod "dns-default-5xxjt" (UID: "a708e3d6-d406-4cfd-ab5f-8dd221a9fd88") : secret "dns-default-metrics-tls" not found Apr 22 18:48:20.694050 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:48:20.693747 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert podName:a39d7311-24fb-45df-884c-194983c0905a nodeName:}" failed. No retries permitted until 2026-04-22 18:49:24.693741484 +0000 UTC m=+161.756225794 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert") pod "ingress-canary-lv9p4" (UID: "a39d7311-24fb-45df-884c-194983c0905a") : secret "canary-serving-cert" not found Apr 22 18:48:22.379264 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.379229 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-wprxw"] Apr 22 18:48:22.381575 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.381557 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-wprxw" Apr 22 18:48:22.384340 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.384316 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 18:48:22.384445 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.384318 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 18:48:22.384445 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.384405 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 18:48:22.385430 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.385415 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-582k6\"" Apr 22 18:48:22.385510 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.385444 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 18:48:22.390544 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.390517 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 18:48:22.390998 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.390980 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-wprxw"] Apr 22 18:48:22.505575 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.505534 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c13de4f5-3c84-460a-9d6a-f2c326e0eef9-serving-cert\") pod 
\"insights-operator-585dfdc468-wprxw\" (UID: \"c13de4f5-3c84-460a-9d6a-f2c326e0eef9\") " pod="openshift-insights/insights-operator-585dfdc468-wprxw" Apr 22 18:48:22.505750 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.505589 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c13de4f5-3c84-460a-9d6a-f2c326e0eef9-snapshots\") pod \"insights-operator-585dfdc468-wprxw\" (UID: \"c13de4f5-3c84-460a-9d6a-f2c326e0eef9\") " pod="openshift-insights/insights-operator-585dfdc468-wprxw" Apr 22 18:48:22.505750 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.505609 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c13de4f5-3c84-460a-9d6a-f2c326e0eef9-service-ca-bundle\") pod \"insights-operator-585dfdc468-wprxw\" (UID: \"c13de4f5-3c84-460a-9d6a-f2c326e0eef9\") " pod="openshift-insights/insights-operator-585dfdc468-wprxw" Apr 22 18:48:22.505750 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.505663 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c13de4f5-3c84-460a-9d6a-f2c326e0eef9-tmp\") pod \"insights-operator-585dfdc468-wprxw\" (UID: \"c13de4f5-3c84-460a-9d6a-f2c326e0eef9\") " pod="openshift-insights/insights-operator-585dfdc468-wprxw" Apr 22 18:48:22.505750 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.505680 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl9cd\" (UniqueName: \"kubernetes.io/projected/c13de4f5-3c84-460a-9d6a-f2c326e0eef9-kube-api-access-fl9cd\") pod \"insights-operator-585dfdc468-wprxw\" (UID: \"c13de4f5-3c84-460a-9d6a-f2c326e0eef9\") " pod="openshift-insights/insights-operator-585dfdc468-wprxw" Apr 22 18:48:22.505750 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.505729 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c13de4f5-3c84-460a-9d6a-f2c326e0eef9-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-wprxw\" (UID: \"c13de4f5-3c84-460a-9d6a-f2c326e0eef9\") " pod="openshift-insights/insights-operator-585dfdc468-wprxw" Apr 22 18:48:22.606146 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.606114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c13de4f5-3c84-460a-9d6a-f2c326e0eef9-snapshots\") pod \"insights-operator-585dfdc468-wprxw\" (UID: \"c13de4f5-3c84-460a-9d6a-f2c326e0eef9\") " pod="openshift-insights/insights-operator-585dfdc468-wprxw" Apr 22 18:48:22.606245 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.606150 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c13de4f5-3c84-460a-9d6a-f2c326e0eef9-service-ca-bundle\") pod \"insights-operator-585dfdc468-wprxw\" (UID: \"c13de4f5-3c84-460a-9d6a-f2c326e0eef9\") " pod="openshift-insights/insights-operator-585dfdc468-wprxw" Apr 22 18:48:22.606245 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.606176 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c13de4f5-3c84-460a-9d6a-f2c326e0eef9-tmp\") pod \"insights-operator-585dfdc468-wprxw\" (UID: 
\"c13de4f5-3c84-460a-9d6a-f2c326e0eef9\") " pod="openshift-insights/insights-operator-585dfdc468-wprxw" Apr 22 18:48:22.606245 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.606199 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fl9cd\" (UniqueName: \"kubernetes.io/projected/c13de4f5-3c84-460a-9d6a-f2c326e0eef9-kube-api-access-fl9cd\") pod \"insights-operator-585dfdc468-wprxw\" (UID: \"c13de4f5-3c84-460a-9d6a-f2c326e0eef9\") " pod="openshift-insights/insights-operator-585dfdc468-wprxw" Apr 22 18:48:22.606414 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.606382 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c13de4f5-3c84-460a-9d6a-f2c326e0eef9-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-wprxw\" (UID: \"c13de4f5-3c84-460a-9d6a-f2c326e0eef9\") " pod="openshift-insights/insights-operator-585dfdc468-wprxw" Apr 22 18:48:22.606514 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.606493 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c13de4f5-3c84-460a-9d6a-f2c326e0eef9-serving-cert\") pod \"insights-operator-585dfdc468-wprxw\" (UID: \"c13de4f5-3c84-460a-9d6a-f2c326e0eef9\") " pod="openshift-insights/insights-operator-585dfdc468-wprxw" Apr 22 18:48:22.606833 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.606811 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c13de4f5-3c84-460a-9d6a-f2c326e0eef9-tmp\") pod \"insights-operator-585dfdc468-wprxw\" (UID: \"c13de4f5-3c84-460a-9d6a-f2c326e0eef9\") " pod="openshift-insights/insights-operator-585dfdc468-wprxw" Apr 22 18:48:22.606913 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.606816 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c13de4f5-3c84-460a-9d6a-f2c326e0eef9-service-ca-bundle\") pod \"insights-operator-585dfdc468-wprxw\" (UID: \"c13de4f5-3c84-460a-9d6a-f2c326e0eef9\") " pod="openshift-insights/insights-operator-585dfdc468-wprxw" Apr 22 18:48:22.606913 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.606900 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c13de4f5-3c84-460a-9d6a-f2c326e0eef9-snapshots\") pod \"insights-operator-585dfdc468-wprxw\" (UID: \"c13de4f5-3c84-460a-9d6a-f2c326e0eef9\") " pod="openshift-insights/insights-operator-585dfdc468-wprxw" Apr 22 18:48:22.607174 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.607153 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c13de4f5-3c84-460a-9d6a-f2c326e0eef9-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-wprxw\" (UID: \"c13de4f5-3c84-460a-9d6a-f2c326e0eef9\") " pod="openshift-insights/insights-operator-585dfdc468-wprxw" Apr 22 18:48:22.608743 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.608724 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c13de4f5-3c84-460a-9d6a-f2c326e0eef9-serving-cert\") pod \"insights-operator-585dfdc468-wprxw\" (UID: \"c13de4f5-3c84-460a-9d6a-f2c326e0eef9\") " pod="openshift-insights/insights-operator-585dfdc468-wprxw" Apr 22 18:48:22.614845 ip-10-0-134-109 kubenswrapper[2575]: I0422 
18:48:22.614822 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl9cd\" (UniqueName: \"kubernetes.io/projected/c13de4f5-3c84-460a-9d6a-f2c326e0eef9-kube-api-access-fl9cd\") pod \"insights-operator-585dfdc468-wprxw\" (UID: \"c13de4f5-3c84-460a-9d6a-f2c326e0eef9\") " pod="openshift-insights/insights-operator-585dfdc468-wprxw" Apr 22 18:48:22.690500 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.690412 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-wprxw" Apr 22 18:48:22.803768 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:22.803721 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-wprxw"] Apr 22 18:48:22.808517 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:48:22.808490 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc13de4f5_3c84_460a_9d6a_f2c326e0eef9.slice/crio-6f9e155e10dd99f6af11a8a394538983264d11dc6684d212348b920d58d36c81 WatchSource:0}: Error finding container 6f9e155e10dd99f6af11a8a394538983264d11dc6684d212348b920d58d36c81: Status 404 returned error can't find the container with id 6f9e155e10dd99f6af11a8a394538983264d11dc6684d212348b920d58d36c81 Apr 22 18:48:23.689099 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:23.689064 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-84n9f" Apr 22 18:48:23.742382 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:23.742325 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-wprxw" event={"ID":"c13de4f5-3c84-460a-9d6a-f2c326e0eef9","Type":"ContainerStarted","Data":"6f9e155e10dd99f6af11a8a394538983264d11dc6684d212348b920d58d36c81"} Apr 22 18:48:24.745385 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:24.745350 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-wprxw" event={"ID":"c13de4f5-3c84-460a-9d6a-f2c326e0eef9","Type":"ContainerStarted","Data":"702463e12cfbef9707673622a335433d02b094f70e7b9f76e169dcc6a1b0b287"} Apr 22 18:48:24.760915 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:24.760867 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-wprxw" podStartSLOduration=0.954552038 podStartE2EDuration="2.760852086s" podCreationTimestamp="2026-04-22 18:48:22 +0000 UTC" firstStartedPulling="2026-04-22 18:48:22.810419724 +0000 UTC m=+99.872904032" lastFinishedPulling="2026-04-22 18:48:24.616719771 +0000 UTC m=+101.679204080" observedRunningTime="2026-04-22 18:48:24.76012066 +0000 UTC m=+101.822604992" watchObservedRunningTime="2026-04-22 18:48:24.760852086 +0000 UTC m=+101.823336495" Apr 22 18:48:28.945503 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:28.945472 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kfn2f_af1a0441-5fdc-4aa3-ac3d-26b54d430c2b/dns-node-resolver/0.log" Apr 22 18:48:29.746292 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:29.746259 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rqzlh_ea1be31b-74a3-48cb-b181-97ff279b206a/node-ca/0.log" Apr 22 18:48:32.327053 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:32.327021 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vnbdg"] Apr 22 18:48:32.329942 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:32.329926 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vnbdg" Apr 22 18:48:32.332437 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:32.332418 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 18:48:32.332591 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:32.332572 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:48:32.332733 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:32.332715 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 18:48:32.333760 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:32.333740 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 18:48:32.333819 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:32.333750 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-74wdj\"" Apr 22 18:48:32.339773 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:32.339749 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vnbdg"] Apr 22 18:48:32.475943 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:32.475905 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzdj4\" (UniqueName: \"kubernetes.io/projected/b9d9c3b2-8e29-4384-acf5-23ae2b9445a0-kube-api-access-hzdj4\") pod \"service-ca-operator-d6fc45fc5-vnbdg\" (UID: \"b9d9c3b2-8e29-4384-acf5-23ae2b9445a0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vnbdg" Apr 22 18:48:32.476087 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:32.475961 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9d9c3b2-8e29-4384-acf5-23ae2b9445a0-config\") pod \"service-ca-operator-d6fc45fc5-vnbdg\" (UID: \"b9d9c3b2-8e29-4384-acf5-23ae2b9445a0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vnbdg" Apr 22 18:48:32.476087 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:32.476036 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9d9c3b2-8e29-4384-acf5-23ae2b9445a0-serving-cert\") pod \"service-ca-operator-d6fc45fc5-vnbdg\" (UID: \"b9d9c3b2-8e29-4384-acf5-23ae2b9445a0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vnbdg" Apr 22 18:48:32.577048 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:32.577012 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9d9c3b2-8e29-4384-acf5-23ae2b9445a0-serving-cert\") pod \"service-ca-operator-d6fc45fc5-vnbdg\" (UID: \"b9d9c3b2-8e29-4384-acf5-23ae2b9445a0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vnbdg" Apr 22 18:48:32.577289 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:32.577074 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hzdj4\" (UniqueName: \"kubernetes.io/projected/b9d9c3b2-8e29-4384-acf5-23ae2b9445a0-kube-api-access-hzdj4\") pod \"service-ca-operator-d6fc45fc5-vnbdg\" (UID: \"b9d9c3b2-8e29-4384-acf5-23ae2b9445a0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vnbdg" Apr 22 18:48:32.577289 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:32.577130 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9d9c3b2-8e29-4384-acf5-23ae2b9445a0-config\") pod \"service-ca-operator-d6fc45fc5-vnbdg\" (UID: \"b9d9c3b2-8e29-4384-acf5-23ae2b9445a0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vnbdg" Apr 22 18:48:32.577944 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:32.577920 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9d9c3b2-8e29-4384-acf5-23ae2b9445a0-config\") pod \"service-ca-operator-d6fc45fc5-vnbdg\" (UID: \"b9d9c3b2-8e29-4384-acf5-23ae2b9445a0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vnbdg" Apr 22 18:48:32.579448 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:32.579424 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9d9c3b2-8e29-4384-acf5-23ae2b9445a0-serving-cert\") pod \"service-ca-operator-d6fc45fc5-vnbdg\" (UID: \"b9d9c3b2-8e29-4384-acf5-23ae2b9445a0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vnbdg" Apr 22 18:48:32.585195 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:32.585162 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzdj4\" (UniqueName: \"kubernetes.io/projected/b9d9c3b2-8e29-4384-acf5-23ae2b9445a0-kube-api-access-hzdj4\") pod \"service-ca-operator-d6fc45fc5-vnbdg\" (UID: \"b9d9c3b2-8e29-4384-acf5-23ae2b9445a0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vnbdg" Apr 22 18:48:32.639394 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:32.639344 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vnbdg" Apr 22 18:48:32.755100 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:32.755073 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vnbdg"] Apr 22 18:48:32.758350 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:48:32.758319 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9d9c3b2_8e29_4384_acf5_23ae2b9445a0.slice/crio-96a9abfd266b8957f9890da9f067c4560d2c1d488e29231055c8ceb5807f5c05 WatchSource:0}: Error finding container 96a9abfd266b8957f9890da9f067c4560d2c1d488e29231055c8ceb5807f5c05: Status 404 returned error can't find the container with id 96a9abfd266b8957f9890da9f067c4560d2c1d488e29231055c8ceb5807f5c05 Apr 22 18:48:32.761492 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:32.761447 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vnbdg" event={"ID":"b9d9c3b2-8e29-4384-acf5-23ae2b9445a0","Type":"ContainerStarted","Data":"96a9abfd266b8957f9890da9f067c4560d2c1d488e29231055c8ceb5807f5c05"} Apr 22 18:48:35.768757 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:35.768718 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vnbdg" event={"ID":"b9d9c3b2-8e29-4384-acf5-23ae2b9445a0","Type":"ContainerStarted","Data":"3c7571c11f8d5f85cbdcef160cf29ba841d00f2ac5d4c093e776c1e8cace3bb5"} Apr 22 18:48:35.783474 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:35.783427 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vnbdg" podStartSLOduration=1.774698474 podStartE2EDuration="3.783411491s" podCreationTimestamp="2026-04-22 18:48:32 +0000 UTC" firstStartedPulling="2026-04-22 18:48:32.760200071 +0000 UTC m=+109.822684381" lastFinishedPulling="2026-04-22 18:48:34.768913073 +0000 UTC m=+111.831397398" observedRunningTime="2026-04-22 18:48:35.782932778 +0000 UTC m=+112.845417108" watchObservedRunningTime="2026-04-22 18:48:35.783411491 +0000 UTC m=+112.845895824" Apr 22 18:48:35.831383 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:35.831340 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-794b4685b8-4n2n9"] Apr 22 18:48:35.834288 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:35.834272 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:35.836924 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:35.836900 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8zvm4\"" Apr 22 18:48:35.837045 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:35.836900 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 18:48:35.837045 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:35.836903 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 18:48:35.837045 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:35.836947 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 18:48:35.842580 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:35.842559 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 18:48:35.842942 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:35.842922 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-794b4685b8-4n2n9"] Apr 22 18:48:35.999221 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:35.999181 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-tls\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:35.999404 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:35.999247 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-certificates\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:35.999404 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:35.999313 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b3d6b355-22f1-4db5-a332-667de73ccfa1-installation-pull-secrets\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:35.999497 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:35.999396 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b3d6b355-22f1-4db5-a332-667de73ccfa1-ca-trust-extracted\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:35.999497 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:35.999426 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b3d6b355-22f1-4db5-a332-667de73ccfa1-image-registry-private-configuration\") pod 
\"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:35.999497 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:35.999445 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrmps\" (UniqueName: \"kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-kube-api-access-jrmps\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:35.999497 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:35.999475 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3d6b355-22f1-4db5-a332-667de73ccfa1-trusted-ca\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:35.999497 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:35.999493 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-bound-sa-token\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:36.100791 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.100757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b3d6b355-22f1-4db5-a332-667de73ccfa1-ca-trust-extracted\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:36.100791 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.100795 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b3d6b355-22f1-4db5-a332-667de73ccfa1-image-registry-private-configuration\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:36.101041 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.100813 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrmps\" (UniqueName: \"kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-kube-api-access-jrmps\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:36.101041 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.100833 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3d6b355-22f1-4db5-a332-667de73ccfa1-trusted-ca\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:36.101041 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.100849 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-bound-sa-token\") pod 
\"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:36.101041 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.100892 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-tls\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:36.101041 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.100916 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-certificates\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:36.101041 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.100982 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b3d6b355-22f1-4db5-a332-667de73ccfa1-installation-pull-secrets\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:36.101353 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:48:36.101116 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:48:36.101353 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:48:36.101141 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-794b4685b8-4n2n9: secret "image-registry-tls" not found Apr 22 18:48:36.105660 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:48:36.101545 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-tls podName:b3d6b355-22f1-4db5-a332-667de73ccfa1 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:36.601516655 +0000 UTC m=+113.664000970 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-tls") pod "image-registry-794b4685b8-4n2n9" (UID: "b3d6b355-22f1-4db5-a332-667de73ccfa1") : secret "image-registry-tls" not found Apr 22 18:48:36.105660 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.101914 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b3d6b355-22f1-4db5-a332-667de73ccfa1-ca-trust-extracted\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:36.105660 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.102179 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-certificates\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:36.105660 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.102430 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3d6b355-22f1-4db5-a332-667de73ccfa1-trusted-ca\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:36.106722 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.106697 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b3d6b355-22f1-4db5-a332-667de73ccfa1-installation-pull-secrets\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:36.108252 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.108230 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b3d6b355-22f1-4db5-a332-667de73ccfa1-image-registry-private-configuration\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:36.111578 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.111555 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-bound-sa-token\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:36.111906 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.111885 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrmps\" (UniqueName: \"kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-kube-api-access-jrmps\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:36.407668 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.407579 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-l97zl"] Apr 22 18:48:36.411832 ip-10-0-134-109 
kubenswrapper[2575]: I0422 18:48:36.411816 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-l97zl" Apr 22 18:48:36.415112 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.415090 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 18:48:36.416169 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.416151 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-5xxql\"" Apr 22 18:48:36.416288 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.416194 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 18:48:36.420693 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.420669 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-l97zl"] Apr 22 18:48:36.505408 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.505373 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clkrc\" (UniqueName: \"kubernetes.io/projected/97cb2698-cc23-43ca-8160-d8f09f48ade2-kube-api-access-clkrc\") pod \"migrator-74bb7799d9-l97zl\" (UID: \"97cb2698-cc23-43ca-8160-d8f09f48ade2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-l97zl" Apr 22 18:48:36.606294 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.606252 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-tls\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:36.606497 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.606314 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clkrc\" (UniqueName: \"kubernetes.io/projected/97cb2698-cc23-43ca-8160-d8f09f48ade2-kube-api-access-clkrc\") pod \"migrator-74bb7799d9-l97zl\" (UID: \"97cb2698-cc23-43ca-8160-d8f09f48ade2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-l97zl" Apr 22 18:48:36.606497 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:48:36.606407 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:48:36.606497 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:48:36.606435 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-794b4685b8-4n2n9: secret "image-registry-tls" not found Apr 22 18:48:36.606497 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:48:36.606489 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-tls podName:b3d6b355-22f1-4db5-a332-667de73ccfa1 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:37.6064735 +0000 UTC m=+114.668957808 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-tls") pod "image-registry-794b4685b8-4n2n9" (UID: "b3d6b355-22f1-4db5-a332-667de73ccfa1") : secret "image-registry-tls" not found Apr 22 18:48:36.613822 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.613803 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clkrc\" (UniqueName: \"kubernetes.io/projected/97cb2698-cc23-43ca-8160-d8f09f48ade2-kube-api-access-clkrc\") pod \"migrator-74bb7799d9-l97zl\" (UID: \"97cb2698-cc23-43ca-8160-d8f09f48ade2\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-l97zl" Apr 22 18:48:36.721348 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.721255 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-l97zl" Apr 22 18:48:36.839794 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:36.839753 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-l97zl"] Apr 22 18:48:36.843413 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:48:36.843386 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97cb2698_cc23_43ca_8160_d8f09f48ade2.slice/crio-50c4cece5157eac158834e5987da831ad0f94bcbdb2eef979b5991fefddbcb07 WatchSource:0}: Error finding container 50c4cece5157eac158834e5987da831ad0f94bcbdb2eef979b5991fefddbcb07: Status 404 returned error can't find the container with id 50c4cece5157eac158834e5987da831ad0f94bcbdb2eef979b5991fefddbcb07 Apr 22 18:48:37.614060 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:37.614021 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-tls\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:37.614346 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:48:37.614176 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:48:37.614346 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:48:37.614197 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-794b4685b8-4n2n9: secret "image-registry-tls" not found Apr 22 18:48:37.614346 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:48:37.614277 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-tls podName:b3d6b355-22f1-4db5-a332-667de73ccfa1 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:39.614261591 +0000 UTC m=+116.676745900 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-tls") pod "image-registry-794b4685b8-4n2n9" (UID: "b3d6b355-22f1-4db5-a332-667de73ccfa1") : secret "image-registry-tls" not found Apr 22 18:48:37.774060 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:37.774016 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-l97zl" event={"ID":"97cb2698-cc23-43ca-8160-d8f09f48ade2","Type":"ContainerStarted","Data":"50c4cece5157eac158834e5987da831ad0f94bcbdb2eef979b5991fefddbcb07"} Apr 22 18:48:38.778315 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:38.778283 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-l97zl" event={"ID":"97cb2698-cc23-43ca-8160-d8f09f48ade2","Type":"ContainerStarted","Data":"68413c29398ab212e97ef98ae8d653f07ccce8607b7b748cb8916e7955a40d58"} Apr 22 18:48:38.778315 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:38.778318 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-l97zl" event={"ID":"97cb2698-cc23-43ca-8160-d8f09f48ade2","Type":"ContainerStarted","Data":"6e036e1e6c948f53dcd1fa1bbd0a5333d070462267ffb67ecac615dfcef86e03"} Apr 22 18:48:38.792238 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:38.792172 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-l97zl" podStartSLOduration=1.688356292 podStartE2EDuration="2.792155761s" podCreationTimestamp="2026-04-22 18:48:36 +0000 UTC" firstStartedPulling="2026-04-22 18:48:36.845197 +0000 UTC m=+113.907681310" lastFinishedPulling="2026-04-22 18:48:37.948996468 +0000 UTC m=+115.011480779" observedRunningTime="2026-04-22 18:48:38.791804841 +0000 UTC m=+115.854289171" watchObservedRunningTime="2026-04-22 18:48:38.792155761 +0000 UTC m=+115.854640091" Apr 22 18:48:39.121368 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.121287 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-kbg4x"] Apr 22 18:48:39.124233 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.124199 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-kbg4x" Apr 22 18:48:39.126752 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.126725 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 18:48:39.126878 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.126728 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 18:48:39.126878 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.126816 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 18:48:39.126878 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.126826 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 18:48:39.127993 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.127974 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-lt2v5\"" Apr 22 18:48:39.132792 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.132772 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-kbg4x"] Apr 22 18:48:39.227403 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.227362 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8522s\" (UniqueName: \"kubernetes.io/projected/eef9d1d2-ec8e-428b-82e8-179de7ddefde-kube-api-access-8522s\") pod \"service-ca-865cb79987-kbg4x\" (UID: \"eef9d1d2-ec8e-428b-82e8-179de7ddefde\") " pod="openshift-service-ca/service-ca-865cb79987-kbg4x" Apr 22 18:48:39.227557 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.227445 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/eef9d1d2-ec8e-428b-82e8-179de7ddefde-signing-cabundle\") pod \"service-ca-865cb79987-kbg4x\" (UID: \"eef9d1d2-ec8e-428b-82e8-179de7ddefde\") " pod="openshift-service-ca/service-ca-865cb79987-kbg4x" Apr 22 18:48:39.227557 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.227503 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/eef9d1d2-ec8e-428b-82e8-179de7ddefde-signing-key\") pod \"service-ca-865cb79987-kbg4x\" (UID: \"eef9d1d2-ec8e-428b-82e8-179de7ddefde\") " pod="openshift-service-ca/service-ca-865cb79987-kbg4x" Apr 22 18:48:39.328626 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.328588 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/eef9d1d2-ec8e-428b-82e8-179de7ddefde-signing-key\") pod \"service-ca-865cb79987-kbg4x\" (UID: \"eef9d1d2-ec8e-428b-82e8-179de7ddefde\") " pod="openshift-service-ca/service-ca-865cb79987-kbg4x" Apr 22 18:48:39.328796 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.328645 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8522s\" (UniqueName: \"kubernetes.io/projected/eef9d1d2-ec8e-428b-82e8-179de7ddefde-kube-api-access-8522s\") pod \"service-ca-865cb79987-kbg4x\" (UID: \"eef9d1d2-ec8e-428b-82e8-179de7ddefde\") " pod="openshift-service-ca/service-ca-865cb79987-kbg4x" Apr 22 18:48:39.328796 ip-10-0-134-109 kubenswrapper[2575]: I0422 
18:48:39.328690 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/eef9d1d2-ec8e-428b-82e8-179de7ddefde-signing-cabundle\") pod \"service-ca-865cb79987-kbg4x\" (UID: \"eef9d1d2-ec8e-428b-82e8-179de7ddefde\") " pod="openshift-service-ca/service-ca-865cb79987-kbg4x" Apr 22 18:48:39.329425 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.329406 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/eef9d1d2-ec8e-428b-82e8-179de7ddefde-signing-cabundle\") pod \"service-ca-865cb79987-kbg4x\" (UID: \"eef9d1d2-ec8e-428b-82e8-179de7ddefde\") " pod="openshift-service-ca/service-ca-865cb79987-kbg4x" Apr 22 18:48:39.331012 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.330989 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/eef9d1d2-ec8e-428b-82e8-179de7ddefde-signing-key\") pod \"service-ca-865cb79987-kbg4x\" (UID: \"eef9d1d2-ec8e-428b-82e8-179de7ddefde\") " pod="openshift-service-ca/service-ca-865cb79987-kbg4x" Apr 22 18:48:39.337098 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.337074 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8522s\" (UniqueName: \"kubernetes.io/projected/eef9d1d2-ec8e-428b-82e8-179de7ddefde-kube-api-access-8522s\") pod \"service-ca-865cb79987-kbg4x\" (UID: \"eef9d1d2-ec8e-428b-82e8-179de7ddefde\") " pod="openshift-service-ca/service-ca-865cb79987-kbg4x" Apr 22 18:48:39.433403 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.433027 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-kbg4x" Apr 22 18:48:39.547668 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.547635 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-kbg4x"] Apr 22 18:48:39.550698 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:48:39.550665 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeef9d1d2_ec8e_428b_82e8_179de7ddefde.slice/crio-8f445a267c2aaeacea5483213d7ec45951bd26e02518840350048e390b48c3eb WatchSource:0}: Error finding container 8f445a267c2aaeacea5483213d7ec45951bd26e02518840350048e390b48c3eb: Status 404 returned error can't find the container with id 8f445a267c2aaeacea5483213d7ec45951bd26e02518840350048e390b48c3eb Apr 22 18:48:39.631579 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.631545 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-tls\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:39.631779 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:48:39.631699 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:48:39.631779 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:48:39.631720 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-794b4685b8-4n2n9: secret "image-registry-tls" not found Apr 22 18:48:39.631779 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:48:39.631774 2575 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-tls podName:b3d6b355-22f1-4db5-a332-667de73ccfa1 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:43.631756805 +0000 UTC m=+120.694241114 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-tls") pod "image-registry-794b4685b8-4n2n9" (UID: "b3d6b355-22f1-4db5-a332-667de73ccfa1") : secret "image-registry-tls" not found Apr 22 18:48:39.783152 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.783113 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-kbg4x" event={"ID":"eef9d1d2-ec8e-428b-82e8-179de7ddefde","Type":"ContainerStarted","Data":"07433652ccd72aeed0b96fac579a0a8a0c2703bc37891d8b9a1cb8f8360d6eac"} Apr 22 18:48:39.783152 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.783153 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-kbg4x" event={"ID":"eef9d1d2-ec8e-428b-82e8-179de7ddefde","Type":"ContainerStarted","Data":"8f445a267c2aaeacea5483213d7ec45951bd26e02518840350048e390b48c3eb"} Apr 22 18:48:39.799301 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:39.799250 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-kbg4x" podStartSLOduration=0.799231886 podStartE2EDuration="799.231886ms" podCreationTimestamp="2026-04-22 18:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:48:39.798866404 +0000 UTC m=+116.861350736" watchObservedRunningTime="2026-04-22 18:48:39.799231886 +0000 UTC m=+116.861716217" Apr 22 18:48:43.666625 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:43.666584 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-tls\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:43.667011 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:48:43.666703 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:48:43.667011 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:48:43.666715 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-794b4685b8-4n2n9: secret "image-registry-tls" not found Apr 22 18:48:43.667011 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:48:43.666764 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-tls podName:b3d6b355-22f1-4db5-a332-667de73ccfa1 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:51.666749466 +0000 UTC m=+128.729233774 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-tls") pod "image-registry-794b4685b8-4n2n9" (UID: "b3d6b355-22f1-4db5-a332-667de73ccfa1") : secret "image-registry-tls" not found Apr 22 18:48:51.731584 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:51.731543 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-tls\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:51.734018 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:51.733995 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-tls\") pod \"image-registry-794b4685b8-4n2n9\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:51.746307 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:51.746284 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8zvm4\"" Apr 22 18:48:51.754345 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:51.754324 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:51.879479 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:51.879445 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-794b4685b8-4n2n9"] Apr 22 18:48:51.882614 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:48:51.882576 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3d6b355_22f1_4db5_a332_667de73ccfa1.slice/crio-aea67e27e267fe66c378ae700848707c3783f41ed895bf309395f24783bc7970 WatchSource:0}: Error finding container aea67e27e267fe66c378ae700848707c3783f41ed895bf309395f24783bc7970: Status 404 returned error can't find the container with id aea67e27e267fe66c378ae700848707c3783f41ed895bf309395f24783bc7970 Apr 22 18:48:52.818117 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:52.818085 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" event={"ID":"b3d6b355-22f1-4db5-a332-667de73ccfa1","Type":"ContainerStarted","Data":"233dd5513f465c5e096c16357316ee8b701bebb8af59ff853558e9dbcbd1d6d6"} Apr 22 18:48:52.818117 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:52.818118 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" event={"ID":"b3d6b355-22f1-4db5-a332-667de73ccfa1","Type":"ContainerStarted","Data":"aea67e27e267fe66c378ae700848707c3783f41ed895bf309395f24783bc7970"} Apr 22 18:48:52.818563 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:52.818215 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:48:52.838412 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:52.838368 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" podStartSLOduration=17.838350714 podStartE2EDuration="17.838350714s" 
podCreationTimestamp="2026-04-22 18:48:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:48:52.837774345 +0000 UTC m=+129.900258675" watchObservedRunningTime="2026-04-22 18:48:52.838350714 +0000 UTC m=+129.900835044" Apr 22 18:48:53.344650 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:53.344609 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs\") pod \"network-metrics-daemon-zn7pv\" (UID: \"16f54576-6941-4246-bcb0-89cfeef13253\") " pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:48:53.347449 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:53.347428 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/16f54576-6941-4246-bcb0-89cfeef13253-metrics-certs\") pod \"network-metrics-daemon-zn7pv\" (UID: \"16f54576-6941-4246-bcb0-89cfeef13253\") " pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:48:53.595867 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:53.595789 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9rzz8\"" Apr 22 18:48:53.603999 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:53.603977 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zn7pv" Apr 22 18:48:53.722929 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:53.722897 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zn7pv"] Apr 22 18:48:53.726299 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:48:53.726267 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16f54576_6941_4246_bcb0_89cfeef13253.slice/crio-367b792b9e42b52ad28b9fa07fff91356d2587d6365dbb942a756afe8280cb6f WatchSource:0}: Error finding container 367b792b9e42b52ad28b9fa07fff91356d2587d6365dbb942a756afe8280cb6f: Status 404 returned error can't find the container with id 367b792b9e42b52ad28b9fa07fff91356d2587d6365dbb942a756afe8280cb6f Apr 22 18:48:53.821395 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:53.821355 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zn7pv" event={"ID":"16f54576-6941-4246-bcb0-89cfeef13253","Type":"ContainerStarted","Data":"367b792b9e42b52ad28b9fa07fff91356d2587d6365dbb942a756afe8280cb6f"} Apr 22 18:48:55.828911 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:55.828874 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zn7pv" event={"ID":"16f54576-6941-4246-bcb0-89cfeef13253","Type":"ContainerStarted","Data":"0197c80b6cc36546a5ed81532dd6589c1510e6b6b1c7c2f1d36d81a35adf2b73"} Apr 22 18:48:55.828911 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:55.828915 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zn7pv" event={"ID":"16f54576-6941-4246-bcb0-89cfeef13253","Type":"ContainerStarted","Data":"01b577ca8367f545c35320c7cdf2635a1733443db131cc022fa2f25e7b96db4d"} Apr 22 18:48:55.845676 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:48:55.845619 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-zn7pv" 
podStartSLOduration=131.817228367 podStartE2EDuration="2m12.845599801s" podCreationTimestamp="2026-04-22 18:46:43 +0000 UTC" firstStartedPulling="2026-04-22 18:48:53.727971218 +0000 UTC m=+130.790455530" lastFinishedPulling="2026-04-22 18:48:54.756342653 +0000 UTC m=+131.818826964" observedRunningTime="2026-04-22 18:48:55.844903163 +0000 UTC m=+132.907387496" watchObservedRunningTime="2026-04-22 18:48:55.845599801 +0000 UTC m=+132.908084133" Apr 22 18:49:01.954327 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:01.954267 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-c7qdg"] Apr 22 18:49:01.955975 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:01.955960 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-c7qdg" Apr 22 18:49:01.958697 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:01.958671 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-gps8x\"" Apr 22 18:49:01.959928 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:01.959908 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 18:49:01.960023 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:01.959949 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 18:49:01.966278 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:01.966259 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-c7qdg"] Apr 22 18:49:02.012276 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.012232 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7b6d1ad8-a365-4cbe-a6c2-78e6855f83c2-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-c7qdg\" (UID: \"7b6d1ad8-a365-4cbe-a6c2-78e6855f83c2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-c7qdg" Apr 22 18:49:02.012462 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.012324 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7b6d1ad8-a365-4cbe-a6c2-78e6855f83c2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-c7qdg\" (UID: \"7b6d1ad8-a365-4cbe-a6c2-78e6855f83c2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-c7qdg" Apr 22 18:49:02.019307 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.019273 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-794b4685b8-4n2n9"] Apr 22 18:49:02.055270 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.055238 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g7dg7"] Apr 22 18:49:02.057044 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.057026 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-rdfsn"] Apr 22 18:49:02.057235 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.057197 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g7dg7" Apr 22 18:49:02.058944 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.058924 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rdfsn" Apr 22 18:49:02.060343 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.060321 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-ldsvz\"" Apr 22 18:49:02.060744 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.060731 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 18:49:02.061196 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.061177 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-6dkrn\"" Apr 22 18:49:02.061620 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.061595 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:49:02.062630 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.062610 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:49:02.070680 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.070658 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g7dg7"] Apr 22 18:49:02.092643 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.092612 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rdfsn"] Apr 22 18:49:02.113436 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.113405 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b6b7df6c-540d-4973-8f17-dd5152793680-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rdfsn\" (UID: \"b6b7df6c-540d-4973-8f17-dd5152793680\") " pod="openshift-insights/insights-runtime-extractor-rdfsn" Apr 22 18:49:02.113601 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.113448 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/42404b27-22f2-4667-b5b0-82e7fbc1840e-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-g7dg7\" (UID: \"42404b27-22f2-4667-b5b0-82e7fbc1840e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g7dg7" Apr 22 18:49:02.113601 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.113481 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b6b7df6c-540d-4973-8f17-dd5152793680-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rdfsn\" (UID: \"b6b7df6c-540d-4973-8f17-dd5152793680\") " pod="openshift-insights/insights-runtime-extractor-rdfsn" Apr 22 18:49:02.113601 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.113515 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/b6b7df6c-540d-4973-8f17-dd5152793680-crio-socket\") pod \"insights-runtime-extractor-rdfsn\" (UID: \"b6b7df6c-540d-4973-8f17-dd5152793680\") " pod="openshift-insights/insights-runtime-extractor-rdfsn" Apr 22 18:49:02.113601 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.113536 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7b6d1ad8-a365-4cbe-a6c2-78e6855f83c2-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-c7qdg\" (UID: \"7b6d1ad8-a365-4cbe-a6c2-78e6855f83c2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-c7qdg" Apr 22 18:49:02.113601 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.113559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7b6d1ad8-a365-4cbe-a6c2-78e6855f83c2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-c7qdg\" (UID: \"7b6d1ad8-a365-4cbe-a6c2-78e6855f83c2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-c7qdg" Apr 22 18:49:02.113601 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.113576 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwtcg\" (UniqueName: \"kubernetes.io/projected/b6b7df6c-540d-4973-8f17-dd5152793680-kube-api-access-rwtcg\") pod \"insights-runtime-extractor-rdfsn\" (UID: \"b6b7df6c-540d-4973-8f17-dd5152793680\") " pod="openshift-insights/insights-runtime-extractor-rdfsn" Apr 22 18:49:02.113928 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.113664 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b6b7df6c-540d-4973-8f17-dd5152793680-data-volume\") pod \"insights-runtime-extractor-rdfsn\" (UID: \"b6b7df6c-540d-4973-8f17-dd5152793680\") " pod="openshift-insights/insights-runtime-extractor-rdfsn" Apr 22 18:49:02.114274 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.114253 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7b6d1ad8-a365-4cbe-a6c2-78e6855f83c2-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-c7qdg\" (UID: \"7b6d1ad8-a365-4cbe-a6c2-78e6855f83c2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-c7qdg" Apr 22 18:49:02.116036 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.116014 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7b6d1ad8-a365-4cbe-a6c2-78e6855f83c2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-c7qdg\" (UID: \"7b6d1ad8-a365-4cbe-a6c2-78e6855f83c2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-c7qdg" Apr 22 18:49:02.214867 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.214781 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b6b7df6c-540d-4973-8f17-dd5152793680-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rdfsn\" (UID: \"b6b7df6c-540d-4973-8f17-dd5152793680\") " pod="openshift-insights/insights-runtime-extractor-rdfsn" Apr 22 18:49:02.214867 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.214820 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tls-certificates\" (UniqueName: \"kubernetes.io/secret/42404b27-22f2-4667-b5b0-82e7fbc1840e-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-g7dg7\" (UID: \"42404b27-22f2-4667-b5b0-82e7fbc1840e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g7dg7" Apr 22 18:49:02.215081 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.214882 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b6b7df6c-540d-4973-8f17-dd5152793680-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rdfsn\" (UID: \"b6b7df6c-540d-4973-8f17-dd5152793680\") " pod="openshift-insights/insights-runtime-extractor-rdfsn" Apr 22 18:49:02.215081 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.214947 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b6b7df6c-540d-4973-8f17-dd5152793680-crio-socket\") pod \"insights-runtime-extractor-rdfsn\" (UID: \"b6b7df6c-540d-4973-8f17-dd5152793680\") " pod="openshift-insights/insights-runtime-extractor-rdfsn" Apr 22 18:49:02.215081 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.214986 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwtcg\" (UniqueName: \"kubernetes.io/projected/b6b7df6c-540d-4973-8f17-dd5152793680-kube-api-access-rwtcg\") pod \"insights-runtime-extractor-rdfsn\" (UID: \"b6b7df6c-540d-4973-8f17-dd5152793680\") " pod="openshift-insights/insights-runtime-extractor-rdfsn" Apr 22 18:49:02.215081 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.215011 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b6b7df6c-540d-4973-8f17-dd5152793680-data-volume\") pod \"insights-runtime-extractor-rdfsn\" (UID: \"b6b7df6c-540d-4973-8f17-dd5152793680\") " pod="openshift-insights/insights-runtime-extractor-rdfsn" Apr 22 18:49:02.215297 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.215081 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b6b7df6c-540d-4973-8f17-dd5152793680-crio-socket\") pod \"insights-runtime-extractor-rdfsn\" (UID: \"b6b7df6c-540d-4973-8f17-dd5152793680\") " pod="openshift-insights/insights-runtime-extractor-rdfsn" Apr 22 18:49:02.215380 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.215362 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b6b7df6c-540d-4973-8f17-dd5152793680-data-volume\") pod \"insights-runtime-extractor-rdfsn\" (UID: \"b6b7df6c-540d-4973-8f17-dd5152793680\") " pod="openshift-insights/insights-runtime-extractor-rdfsn" Apr 22 18:49:02.215461 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.215430 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b6b7df6c-540d-4973-8f17-dd5152793680-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rdfsn\" (UID: \"b6b7df6c-540d-4973-8f17-dd5152793680\") " pod="openshift-insights/insights-runtime-extractor-rdfsn" Apr 22 18:49:02.217289 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.217257 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/b6b7df6c-540d-4973-8f17-dd5152793680-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rdfsn\" (UID: \"b6b7df6c-540d-4973-8f17-dd5152793680\") " pod="openshift-insights/insights-runtime-extractor-rdfsn" Apr 22 18:49:02.217387 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.217373 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/42404b27-22f2-4667-b5b0-82e7fbc1840e-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-g7dg7\" (UID: \"42404b27-22f2-4667-b5b0-82e7fbc1840e\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g7dg7" Apr 22 18:49:02.223277 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.223257 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwtcg\" (UniqueName: \"kubernetes.io/projected/b6b7df6c-540d-4973-8f17-dd5152793680-kube-api-access-rwtcg\") pod \"insights-runtime-extractor-rdfsn\" (UID: \"b6b7df6c-540d-4973-8f17-dd5152793680\") " pod="openshift-insights/insights-runtime-extractor-rdfsn" Apr 22 18:49:02.265365 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.265328 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-c7qdg" Apr 22 18:49:02.367158 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.367126 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g7dg7" Apr 22 18:49:02.373033 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.373000 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rdfsn" Apr 22 18:49:02.383676 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.383647 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-c7qdg"] Apr 22 18:49:02.386969 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:49:02.386942 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b6d1ad8_a365_4cbe_a6c2_78e6855f83c2.slice/crio-5eb75a6594a3ee3b434be1796bf44ad671af82936ca1d70daf6b8a70596f2082 WatchSource:0}: Error finding container 5eb75a6594a3ee3b434be1796bf44ad671af82936ca1d70daf6b8a70596f2082: Status 404 returned error can't find the container with id 5eb75a6594a3ee3b434be1796bf44ad671af82936ca1d70daf6b8a70596f2082 Apr 22 18:49:02.512142 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.512082 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g7dg7"] Apr 22 18:49:02.517402 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:49:02.517375 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42404b27_22f2_4667_b5b0_82e7fbc1840e.slice/crio-dd4b6cb14dc73d587a743bb131dcaac121a8863fc9bfead5832aa21ddc7c661c WatchSource:0}: Error finding container dd4b6cb14dc73d587a743bb131dcaac121a8863fc9bfead5832aa21ddc7c661c: Status 404 returned error can't find the container with id dd4b6cb14dc73d587a743bb131dcaac121a8863fc9bfead5832aa21ddc7c661c Apr 22 18:49:02.532061 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.532030 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-runtime-extractor-rdfsn"] Apr 22 18:49:02.534603 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:49:02.534570 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6b7df6c_540d_4973_8f17_dd5152793680.slice/crio-a96a22ab270ee7c84983675854d4d32f72abc3d73aa164cc7de0986555db98be WatchSource:0}: Error finding container a96a22ab270ee7c84983675854d4d32f72abc3d73aa164cc7de0986555db98be: Status 404 returned error can't find the container with id a96a22ab270ee7c84983675854d4d32f72abc3d73aa164cc7de0986555db98be Apr 22 18:49:02.848655 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.848607 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rdfsn" event={"ID":"b6b7df6c-540d-4973-8f17-dd5152793680","Type":"ContainerStarted","Data":"aeb6dda6a28c2c984d2f6644c8fcde16aee22da8892bcc31dc2a8ba899e84ae7"} Apr 22 18:49:02.848655 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.848659 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rdfsn" event={"ID":"b6b7df6c-540d-4973-8f17-dd5152793680","Type":"ContainerStarted","Data":"a96a22ab270ee7c84983675854d4d32f72abc3d73aa164cc7de0986555db98be"} Apr 22 18:49:02.849598 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.849568 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g7dg7" event={"ID":"42404b27-22f2-4667-b5b0-82e7fbc1840e","Type":"ContainerStarted","Data":"dd4b6cb14dc73d587a743bb131dcaac121a8863fc9bfead5832aa21ddc7c661c"} Apr 22 18:49:02.850474 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:02.850445 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-c7qdg" event={"ID":"7b6d1ad8-a365-4cbe-a6c2-78e6855f83c2","Type":"ContainerStarted","Data":"5eb75a6594a3ee3b434be1796bf44ad671af82936ca1d70daf6b8a70596f2082"} Apr 22 18:49:03.856712 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:03.856677 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-c7qdg" event={"ID":"7b6d1ad8-a365-4cbe-a6c2-78e6855f83c2","Type":"ContainerStarted","Data":"a7472d76db48360899f58327558e64b4ee7c4a26cdd22b9ea7880d8365123412"} Apr 22 18:49:03.858219 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:03.858183 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rdfsn" event={"ID":"b6b7df6c-540d-4973-8f17-dd5152793680","Type":"ContainerStarted","Data":"61df117dae9f074f4045e91237065a4bac33b5c645455c429477d49347f194e7"} Apr 22 18:49:03.859439 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:03.859419 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g7dg7" event={"ID":"42404b27-22f2-4667-b5b0-82e7fbc1840e","Type":"ContainerStarted","Data":"f63bfc3f2740fcc67611a62625563b74727f8aa9a0e76080f27c4a837722a8c0"} Apr 22 18:49:03.859589 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:03.859573 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g7dg7" Apr 22 18:49:03.865309 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:03.865287 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g7dg7" Apr 22 18:49:03.871799 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:03.871758 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-c7qdg" podStartSLOduration=1.534497017 podStartE2EDuration="2.871745843s" podCreationTimestamp="2026-04-22 18:49:01 +0000 UTC" firstStartedPulling="2026-04-22 18:49:02.389295892 +0000 UTC m=+139.451780201" lastFinishedPulling="2026-04-22 18:49:03.726544718 +0000 UTC m=+140.789029027" observedRunningTime="2026-04-22 18:49:03.871042072 +0000 UTC m=+140.933526403" watchObservedRunningTime="2026-04-22 18:49:03.871745843 +0000 UTC m=+140.934230174" Apr 22 18:49:03.884616 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:03.884561 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-g7dg7" podStartSLOduration=0.675393676 podStartE2EDuration="1.884547035s" podCreationTimestamp="2026-04-22 18:49:02 +0000 UTC" firstStartedPulling="2026-04-22 18:49:02.51930546 +0000 UTC m=+139.581789768" lastFinishedPulling="2026-04-22 18:49:03.72845881 +0000 UTC m=+140.790943127" observedRunningTime="2026-04-22 18:49:03.884350927 +0000 UTC m=+140.946835271" watchObservedRunningTime="2026-04-22 18:49:03.884547035 +0000 UTC m=+140.947031366" Apr 22 18:49:04.600043 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.600003 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-tc87t"] Apr 22 18:49:04.603170 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.603141 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-tc87t" Apr 22 18:49:04.607245 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.607197 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 18:49:04.607454 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.607338 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 18:49:04.607454 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.607351 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-s4j9q\"" Apr 22 18:49:04.607550 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.607522 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 18:49:04.607634 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.607592 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 18:49:04.608198 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.608177 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 18:49:04.611152 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.611133 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-tc87t"] Apr 22 18:49:04.633483 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.633441 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b630301-6ad2-44fd-bf82-2128fb0cbf7f-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-tc87t\" (UID: \"3b630301-6ad2-44fd-bf82-2128fb0cbf7f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tc87t" Apr 22 18:49:04.633639 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.633486 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgsm2\" (UniqueName: \"kubernetes.io/projected/3b630301-6ad2-44fd-bf82-2128fb0cbf7f-kube-api-access-bgsm2\") pod \"prometheus-operator-5676c8c784-tc87t\" (UID: \"3b630301-6ad2-44fd-bf82-2128fb0cbf7f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tc87t" Apr 22 18:49:04.633639 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.633550 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b630301-6ad2-44fd-bf82-2128fb0cbf7f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-tc87t\" (UID: \"3b630301-6ad2-44fd-bf82-2128fb0cbf7f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tc87t" Apr 22 18:49:04.633740 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.633647 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b630301-6ad2-44fd-bf82-2128fb0cbf7f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-tc87t\" (UID: \"3b630301-6ad2-44fd-bf82-2128fb0cbf7f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tc87t" Apr 22 18:49:04.734735 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.734693 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b630301-6ad2-44fd-bf82-2128fb0cbf7f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-tc87t\" (UID: \"3b630301-6ad2-44fd-bf82-2128fb0cbf7f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tc87t" Apr 22 18:49:04.734911 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.734770 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b630301-6ad2-44fd-bf82-2128fb0cbf7f-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-tc87t\" (UID: \"3b630301-6ad2-44fd-bf82-2128fb0cbf7f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tc87t" Apr 22 18:49:04.734911 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.734801 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgsm2\" (UniqueName: \"kubernetes.io/projected/3b630301-6ad2-44fd-bf82-2128fb0cbf7f-kube-api-access-bgsm2\") pod \"prometheus-operator-5676c8c784-tc87t\" (UID: \"3b630301-6ad2-44fd-bf82-2128fb0cbf7f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tc87t" Apr 22 18:49:04.735132 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.735081 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b630301-6ad2-44fd-bf82-2128fb0cbf7f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-tc87t\" (UID: \"3b630301-6ad2-44fd-bf82-2128fb0cbf7f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tc87t" Apr 
22 18:49:04.735540 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.735517 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3b630301-6ad2-44fd-bf82-2128fb0cbf7f-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-tc87t\" (UID: \"3b630301-6ad2-44fd-bf82-2128fb0cbf7f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tc87t" Apr 22 18:49:04.737346 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.737320 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3b630301-6ad2-44fd-bf82-2128fb0cbf7f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-tc87t\" (UID: \"3b630301-6ad2-44fd-bf82-2128fb0cbf7f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tc87t" Apr 22 18:49:04.737642 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.737615 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/3b630301-6ad2-44fd-bf82-2128fb0cbf7f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-tc87t\" (UID: \"3b630301-6ad2-44fd-bf82-2128fb0cbf7f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tc87t" Apr 22 18:49:04.742654 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.742630 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgsm2\" (UniqueName: \"kubernetes.io/projected/3b630301-6ad2-44fd-bf82-2128fb0cbf7f-kube-api-access-bgsm2\") pod \"prometheus-operator-5676c8c784-tc87t\" (UID: \"3b630301-6ad2-44fd-bf82-2128fb0cbf7f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-tc87t" Apr 22 18:49:04.916656 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:04.916566 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-tc87t" Apr 22 18:49:05.197974 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:05.197943 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-tc87t"] Apr 22 18:49:05.201659 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:49:05.201619 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b630301_6ad2_44fd_bf82_2128fb0cbf7f.slice/crio-db25faa1504293fb2bf269a69b8b0b2cfdf586ad942ae4d90484793db444ed55 WatchSource:0}: Error finding container db25faa1504293fb2bf269a69b8b0b2cfdf586ad942ae4d90484793db444ed55: Status 404 returned error can't find the container with id db25faa1504293fb2bf269a69b8b0b2cfdf586ad942ae4d90484793db444ed55 Apr 22 18:49:05.865775 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:05.865702 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-tc87t" event={"ID":"3b630301-6ad2-44fd-bf82-2128fb0cbf7f","Type":"ContainerStarted","Data":"db25faa1504293fb2bf269a69b8b0b2cfdf586ad942ae4d90484793db444ed55"} Apr 22 18:49:05.867619 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:05.867583 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rdfsn" event={"ID":"b6b7df6c-540d-4973-8f17-dd5152793680","Type":"ContainerStarted","Data":"6bd86b5afee076d02ac26df749d0fd090d9d5ce7aeb68cbdfd48922b41812d3d"} Apr 22 18:49:05.885947 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:05.885896 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-rdfsn" podStartSLOduration=1.34978505 podStartE2EDuration="3.88587804s" podCreationTimestamp="2026-04-22 18:49:02 +0000 UTC" firstStartedPulling="2026-04-22 18:49:02.590336289 +0000 UTC m=+139.652820605" lastFinishedPulling="2026-04-22 18:49:05.126429282 +0000 UTC m=+142.188913595" observedRunningTime="2026-04-22 18:49:05.883873078 +0000 UTC m=+142.946357412" watchObservedRunningTime="2026-04-22 18:49:05.88587804 +0000 UTC m=+142.948362391" Apr 22 18:49:06.872095 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:06.872050 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-tc87t" event={"ID":"3b630301-6ad2-44fd-bf82-2128fb0cbf7f","Type":"ContainerStarted","Data":"a06386973c307e603773033c51c8bf27315e5536f0c4c68dcccc2db7f9e2fd15"} Apr 22 18:49:06.872517 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:06.872103 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-tc87t" event={"ID":"3b630301-6ad2-44fd-bf82-2128fb0cbf7f","Type":"ContainerStarted","Data":"921ef88c61daee14c8e428e97f3a66a4c806eccda0c7854edec4e77442f51d19"} Apr 22 18:49:06.888754 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:06.888710 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-tc87t" podStartSLOduration=1.647498987 podStartE2EDuration="2.888694969s" podCreationTimestamp="2026-04-22 18:49:04 +0000 UTC" firstStartedPulling="2026-04-22 18:49:05.203666568 +0000 UTC m=+142.266150877" lastFinishedPulling="2026-04-22 18:49:06.444862538 +0000 UTC m=+143.507346859" observedRunningTime="2026-04-22 18:49:06.887247659 +0000 UTC m=+143.949731990" watchObservedRunningTime="2026-04-22 18:49:06.888694969 +0000 UTC 
m=+143.951179299" Apr 22 18:49:08.939580 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:08.939541 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj"] Apr 22 18:49:08.944630 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:08.944609 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj" Apr 22 18:49:08.947693 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:08.947661 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 18:49:08.947693 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:08.947689 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 18:49:08.948102 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:08.947962 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-xt95p\"" Apr 22 18:49:08.952415 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:08.952392 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj"] Apr 22 18:49:08.955327 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:08.955303 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-6ct76"] Apr 22 18:49:08.959257 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:08.959238 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:08.962361 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:08.962338 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 18:49:08.962486 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:08.962420 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 18:49:08.962486 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:08.962449 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 18:49:08.962812 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:08.962792 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-nltc8\"" Apr 22 18:49:08.972234 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:08.971085 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78a4a419-e37d-403e-996d-669a974f998b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-c74zj\" (UID: \"78a4a419-e37d-403e-996d-669a974f998b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj" Apr 22 18:49:08.972234 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:08.971164 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78a4a419-e37d-403e-996d-669a974f998b-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-c74zj\" (UID: \"78a4a419-e37d-403e-996d-669a974f998b\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj" Apr 22 18:49:08.972234 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:08.971250 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwbv6\" (UniqueName: \"kubernetes.io/projected/78a4a419-e37d-403e-996d-669a974f998b-kube-api-access-rwbv6\") pod \"openshift-state-metrics-9d44df66c-c74zj\" (UID: \"78a4a419-e37d-403e-996d-669a974f998b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj" Apr 22 18:49:08.972234 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:08.971322 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/78a4a419-e37d-403e-996d-669a974f998b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-c74zj\" (UID: \"78a4a419-e37d-403e-996d-669a974f998b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj" Apr 22 18:49:09.071743 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.071701 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b7cef634-5e2e-4422-9c18-541b2a2e35a4-node-exporter-accelerators-collector-config\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.071743 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.071743 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b7cef634-5e2e-4422-9c18-541b2a2e35a4-node-exporter-tls\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.072001 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.071762 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b7cef634-5e2e-4422-9c18-541b2a2e35a4-root\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.072001 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.071796 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78a4a419-e37d-403e-996d-669a974f998b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-c74zj\" (UID: \"78a4a419-e37d-403e-996d-669a974f998b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj" Apr 22 18:49:09.072001 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.071848 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b7cef634-5e2e-4422-9c18-541b2a2e35a4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.072145 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.071994 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b7cef634-5e2e-4422-9c18-541b2a2e35a4-metrics-client-ca\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.072145 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.072046 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78a4a419-e37d-403e-996d-669a974f998b-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-c74zj\" (UID: \"78a4a419-e37d-403e-996d-669a974f998b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj" Apr 22 18:49:09.072145 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.072117 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b7cef634-5e2e-4422-9c18-541b2a2e35a4-node-exporter-wtmp\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.072304 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.072148 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkwtn\" (UniqueName: \"kubernetes.io/projected/b7cef634-5e2e-4422-9c18-541b2a2e35a4-kube-api-access-fkwtn\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.072304 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.072173 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b7cef634-5e2e-4422-9c18-541b2a2e35a4-sys\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.072304 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.072203 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b7cef634-5e2e-4422-9c18-541b2a2e35a4-node-exporter-textfile\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.072450 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.072373 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwbv6\" (UniqueName: \"kubernetes.io/projected/78a4a419-e37d-403e-996d-669a974f998b-kube-api-access-rwbv6\") pod \"openshift-state-metrics-9d44df66c-c74zj\" (UID: \"78a4a419-e37d-403e-996d-669a974f998b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj" Apr 22 18:49:09.072636 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.072612 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/78a4a419-e37d-403e-996d-669a974f998b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-c74zj\" (UID: \"78a4a419-e37d-403e-996d-669a974f998b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj" Apr 22 18:49:09.072951 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.072928 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78a4a419-e37d-403e-996d-669a974f998b-metrics-client-ca\") pod 
\"openshift-state-metrics-9d44df66c-c74zj\" (UID: \"78a4a419-e37d-403e-996d-669a974f998b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj" Apr 22 18:49:09.075191 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.075165 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78a4a419-e37d-403e-996d-669a974f998b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-c74zj\" (UID: \"78a4a419-e37d-403e-996d-669a974f998b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj" Apr 22 18:49:09.075310 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.075289 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/78a4a419-e37d-403e-996d-669a974f998b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-c74zj\" (UID: \"78a4a419-e37d-403e-996d-669a974f998b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj" Apr 22 18:49:09.080478 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.080453 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwbv6\" (UniqueName: \"kubernetes.io/projected/78a4a419-e37d-403e-996d-669a974f998b-kube-api-access-rwbv6\") pod \"openshift-state-metrics-9d44df66c-c74zj\" (UID: \"78a4a419-e37d-403e-996d-669a974f998b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj" Apr 22 18:49:09.173516 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.173480 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b7cef634-5e2e-4422-9c18-541b2a2e35a4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.173754 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.173733 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b7cef634-5e2e-4422-9c18-541b2a2e35a4-metrics-client-ca\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.174507 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.174484 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b7cef634-5e2e-4422-9c18-541b2a2e35a4-node-exporter-wtmp\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.174818 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.174800 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fkwtn\" (UniqueName: \"kubernetes.io/projected/b7cef634-5e2e-4422-9c18-541b2a2e35a4-kube-api-access-fkwtn\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.174946 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.174931 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b7cef634-5e2e-4422-9c18-541b2a2e35a4-sys\") pod \"node-exporter-6ct76\" (UID: 
\"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.175057 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.175042 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b7cef634-5e2e-4422-9c18-541b2a2e35a4-node-exporter-textfile\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.175282 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.175260 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b7cef634-5e2e-4422-9c18-541b2a2e35a4-node-exporter-accelerators-collector-config\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.175418 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.175404 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b7cef634-5e2e-4422-9c18-541b2a2e35a4-node-exporter-tls\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.175525 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.175512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b7cef634-5e2e-4422-9c18-541b2a2e35a4-root\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.175692 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.175677 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b7cef634-5e2e-4422-9c18-541b2a2e35a4-root\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.175792 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.174410 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b7cef634-5e2e-4422-9c18-541b2a2e35a4-metrics-client-ca\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.176163 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.174758 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b7cef634-5e2e-4422-9c18-541b2a2e35a4-node-exporter-wtmp\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.176359 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.176343 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b7cef634-5e2e-4422-9c18-541b2a2e35a4-sys\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.178104 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.177385 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/b7cef634-5e2e-4422-9c18-541b2a2e35a4-node-exporter-accelerators-collector-config\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.178104 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.177848 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b7cef634-5e2e-4422-9c18-541b2a2e35a4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.178104 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.178049 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b7cef634-5e2e-4422-9c18-541b2a2e35a4-node-exporter-textfile\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.179778 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.179754 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b7cef634-5e2e-4422-9c18-541b2a2e35a4-node-exporter-tls\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.183291 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.183268 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkwtn\" (UniqueName: \"kubernetes.io/projected/b7cef634-5e2e-4422-9c18-541b2a2e35a4-kube-api-access-fkwtn\") pod \"node-exporter-6ct76\" (UID: \"b7cef634-5e2e-4422-9c18-541b2a2e35a4\") " pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.258569 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.258486 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj" Apr 22 18:49:09.271285 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.271243 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-6ct76" Apr 22 18:49:09.279368 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:49:09.279340 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7cef634_5e2e_4422_9c18_541b2a2e35a4.slice/crio-55dcc089cefdca73f0d6a7bb4507b5c285502c502933a02f0dbec423327cddfc WatchSource:0}: Error finding container 55dcc089cefdca73f0d6a7bb4507b5c285502c502933a02f0dbec423327cddfc: Status 404 returned error can't find the container with id 55dcc089cefdca73f0d6a7bb4507b5c285502c502933a02f0dbec423327cddfc Apr 22 18:49:09.378300 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.378267 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj"] Apr 22 18:49:09.381274 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:49:09.381230 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78a4a419_e37d_403e_996d_669a974f998b.slice/crio-33cc84e776b4d97da8c50641beaa7a80d538980be955c976e199a72ae29d3408 WatchSource:0}: Error finding container 33cc84e776b4d97da8c50641beaa7a80d538980be955c976e199a72ae29d3408: Status 404 returned error can't find the container with id 33cc84e776b4d97da8c50641beaa7a80d538980be955c976e199a72ae29d3408 Apr 22 18:49:09.882273 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.882239 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj" event={"ID":"78a4a419-e37d-403e-996d-669a974f998b","Type":"ContainerStarted","Data":"aeca3d9595d4722d161c1b9c1358ac84fb42a110a08e885a42a61b5653a9e91d"} Apr 22 18:49:09.882273 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.882275 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj" event={"ID":"78a4a419-e37d-403e-996d-669a974f998b","Type":"ContainerStarted","Data":"19e10c4f20e428f57e7895f089991d5c0fe3ce16d5df0eed1172751d3df621ba"} Apr 22 18:49:09.882522 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.882286 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj" event={"ID":"78a4a419-e37d-403e-996d-669a974f998b","Type":"ContainerStarted","Data":"33cc84e776b4d97da8c50641beaa7a80d538980be955c976e199a72ae29d3408"} Apr 22 18:49:09.883409 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:09.883384 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6ct76" event={"ID":"b7cef634-5e2e-4422-9c18-541b2a2e35a4","Type":"ContainerStarted","Data":"55dcc089cefdca73f0d6a7bb4507b5c285502c502933a02f0dbec423327cddfc"} Apr 22 18:49:10.887987 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:10.887887 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj" event={"ID":"78a4a419-e37d-403e-996d-669a974f998b","Type":"ContainerStarted","Data":"bf5782b63ccd5248d71ea899409499e1086f0c88b5dad30613365159b1366639"} Apr 22 18:49:10.889309 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:10.889285 2575 generic.go:358] "Generic (PLEG): container finished" podID="b7cef634-5e2e-4422-9c18-541b2a2e35a4" containerID="39b375cf885863a71cef3093e1f60a76f24c1935591691b124d7091ba69e6a25" exitCode=0 Apr 22 18:49:10.889393 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:10.889329 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/node-exporter-6ct76" event={"ID":"b7cef634-5e2e-4422-9c18-541b2a2e35a4","Type":"ContainerDied","Data":"39b375cf885863a71cef3093e1f60a76f24c1935591691b124d7091ba69e6a25"} Apr 22 18:49:10.907362 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:10.907310 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-c74zj" podStartSLOduration=1.831126533 podStartE2EDuration="2.907293426s" podCreationTimestamp="2026-04-22 18:49:08 +0000 UTC" firstStartedPulling="2026-04-22 18:49:09.503885444 +0000 UTC m=+146.566369753" lastFinishedPulling="2026-04-22 18:49:10.580052337 +0000 UTC m=+147.642536646" observedRunningTime="2026-04-22 18:49:10.904784288 +0000 UTC m=+147.967268619" watchObservedRunningTime="2026-04-22 18:49:10.907293426 +0000 UTC m=+147.969777757" Apr 22 18:49:10.933242 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:10.933197 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr"] Apr 22 18:49:10.937898 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:10.937873 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:10.940791 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:10.940759 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 22 18:49:10.941068 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:10.941046 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 22 18:49:10.941144 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:10.941111 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 22 18:49:10.941230 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:10.941149 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 22 18:49:10.941230 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:10.941155 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 22 18:49:10.941230 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:10.941176 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-hxn7q\"" Apr 22 18:49:10.941230 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:10.941177 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-5nb12h1oirvm1\"" Apr 22 18:49:10.949631 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:10.949602 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr"] Apr 22 18:49:10.990403 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:10.990378 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/41cbca3c-525e-4447-aa48-6fec0572815b-secret-grpc-tls\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:10.990520 ip-10-0-134-109 kubenswrapper[2575]: I0422 
18:49:10.990419 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41cbca3c-525e-4447-aa48-6fec0572815b-metrics-client-ca\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:10.990520 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:10.990499 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/41cbca3c-525e-4447-aa48-6fec0572815b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:10.990642 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:10.990528 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/41cbca3c-525e-4447-aa48-6fec0572815b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:10.990642 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:10.990623 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klrnc\" (UniqueName: \"kubernetes.io/projected/41cbca3c-525e-4447-aa48-6fec0572815b-kube-api-access-klrnc\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:10.990808 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:10.990786 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/41cbca3c-525e-4447-aa48-6fec0572815b-secret-thanos-querier-tls\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:10.991047 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:10.990920 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/41cbca3c-525e-4447-aa48-6fec0572815b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:10.991047 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:10.990968 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/41cbca3c-525e-4447-aa48-6fec0572815b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:11.092185 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.092147 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/41cbca3c-525e-4447-aa48-6fec0572815b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:11.092185 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.092190 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/41cbca3c-525e-4447-aa48-6fec0572815b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:11.092470 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.092346 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klrnc\" (UniqueName: \"kubernetes.io/projected/41cbca3c-525e-4447-aa48-6fec0572815b-kube-api-access-klrnc\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:11.092470 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.092385 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/41cbca3c-525e-4447-aa48-6fec0572815b-secret-thanos-querier-tls\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:11.092470 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.092433 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/41cbca3c-525e-4447-aa48-6fec0572815b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:11.092470 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.092453 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/41cbca3c-525e-4447-aa48-6fec0572815b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:11.092725 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.092526 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/41cbca3c-525e-4447-aa48-6fec0572815b-secret-grpc-tls\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:11.092725 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.092555 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41cbca3c-525e-4447-aa48-6fec0572815b-metrics-client-ca\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:11.093645 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.093562 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41cbca3c-525e-4447-aa48-6fec0572815b-metrics-client-ca\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:11.095712 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.095624 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/41cbca3c-525e-4447-aa48-6fec0572815b-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:11.095712 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.095676 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/41cbca3c-525e-4447-aa48-6fec0572815b-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:11.095899 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.095783 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/41cbca3c-525e-4447-aa48-6fec0572815b-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:11.095899 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.095860 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/41cbca3c-525e-4447-aa48-6fec0572815b-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:11.099158 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.096039 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/41cbca3c-525e-4447-aa48-6fec0572815b-secret-thanos-querier-tls\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:11.099158 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.096060 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/41cbca3c-525e-4447-aa48-6fec0572815b-secret-grpc-tls\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:11.102045 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.102024 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klrnc\" (UniqueName: \"kubernetes.io/projected/41cbca3c-525e-4447-aa48-6fec0572815b-kube-api-access-klrnc\") pod \"thanos-querier-545fb8c4d5-cbqvr\" (UID: \"41cbca3c-525e-4447-aa48-6fec0572815b\") " pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:11.248232 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.248100 2575 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:11.374909 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.374874 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr"] Apr 22 18:49:11.378130 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:49:11.378097 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41cbca3c_525e_4447_aa48_6fec0572815b.slice/crio-4761fed053675881ac391300256093c841bc124beb410c9e4a9f19e5085dea60 WatchSource:0}: Error finding container 4761fed053675881ac391300256093c841bc124beb410c9e4a9f19e5085dea60: Status 404 returned error can't find the container with id 4761fed053675881ac391300256093c841bc124beb410c9e4a9f19e5085dea60 Apr 22 18:49:11.894058 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.894015 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6ct76" event={"ID":"b7cef634-5e2e-4422-9c18-541b2a2e35a4","Type":"ContainerStarted","Data":"40eb959d87f18a6dea3aeeae8e00cdb69e44528118cb277a0a6fd18ddc82959f"} Apr 22 18:49:11.894058 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.894057 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6ct76" event={"ID":"b7cef634-5e2e-4422-9c18-541b2a2e35a4","Type":"ContainerStarted","Data":"e3dbf496a716713c0ae18ea5c2e83e0d7754bc62dcc6f1b95b35d56d0eff1ee1"} Apr 22 18:49:11.895172 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.895142 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" event={"ID":"41cbca3c-525e-4447-aa48-6fec0572815b","Type":"ContainerStarted","Data":"4761fed053675881ac391300256093c841bc124beb410c9e4a9f19e5085dea60"} Apr 22 18:49:11.916665 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:11.916616 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-6ct76" podStartSLOduration=3.285491519 podStartE2EDuration="3.916601547s" podCreationTimestamp="2026-04-22 18:49:08 +0000 UTC" firstStartedPulling="2026-04-22 18:49:09.280993163 +0000 UTC m=+146.343477472" lastFinishedPulling="2026-04-22 18:49:09.912103188 +0000 UTC m=+146.974587500" observedRunningTime="2026-04-22 18:49:11.915031206 +0000 UTC m=+148.977515562" watchObservedRunningTime="2026-04-22 18:49:11.916601547 +0000 UTC m=+148.979085878" Apr 22 18:49:12.026417 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:12.026378 2575 patch_prober.go:28] interesting pod/image-registry-794b4685b8-4n2n9 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:49:12.026597 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:12.026438 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" podUID="b3d6b355-22f1-4db5-a332-667de73ccfa1" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:49:13.711370 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:13.711335 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jcxvp"] Apr 22 18:49:13.715513 ip-10-0-134-109 
kubenswrapper[2575]: I0422 18:49:13.715485 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jcxvp" Apr 22 18:49:13.718039 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:13.718014 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 18:49:13.718180 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:13.718048 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-7vwnj\"" Apr 22 18:49:13.721676 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:13.721652 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jcxvp"] Apr 22 18:49:13.817805 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:13.817770 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7699e9a-8c09-4e2b-80d0-2164298c3efb-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jcxvp\" (UID: \"c7699e9a-8c09-4e2b-80d0-2164298c3efb\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jcxvp" Apr 22 18:49:13.905382 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:13.905344 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" event={"ID":"41cbca3c-525e-4447-aa48-6fec0572815b","Type":"ContainerStarted","Data":"41124d1604a79e37d6890f9cf0c7bf8cb4fc206f97460840dd2fef8b45a4c5d2"} Apr 22 18:49:13.905382 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:13.905385 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" event={"ID":"41cbca3c-525e-4447-aa48-6fec0572815b","Type":"ContainerStarted","Data":"36e2f437efc028d3f21177e25f737fcdfc2950f9911e58e76b18a01546350c72"} Apr 22 18:49:13.905583 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:13.905395 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" event={"ID":"41cbca3c-525e-4447-aa48-6fec0572815b","Type":"ContainerStarted","Data":"ece4e0481f3822cbf5acce51ec984d22ae6d43c07bb38bd5f8892835d1b3ef44"} Apr 22 18:49:13.918821 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:13.918793 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7699e9a-8c09-4e2b-80d0-2164298c3efb-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jcxvp\" (UID: \"c7699e9a-8c09-4e2b-80d0-2164298c3efb\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jcxvp" Apr 22 18:49:13.921248 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:13.921202 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7699e9a-8c09-4e2b-80d0-2164298c3efb-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jcxvp\" (UID: \"c7699e9a-8c09-4e2b-80d0-2164298c3efb\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jcxvp" Apr 22 18:49:14.027756 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:14.027723 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jcxvp" Apr 22 18:49:14.183402 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:14.183337 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jcxvp"] Apr 22 18:49:14.278361 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:49:14.278285 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7699e9a_8c09_4e2b_80d0_2164298c3efb.slice/crio-4dcd2015bebb476026905bd3ec9d3d863f682357e6da7d8fb4e9bd147b97de33 WatchSource:0}: Error finding container 4dcd2015bebb476026905bd3ec9d3d863f682357e6da7d8fb4e9bd147b97de33: Status 404 returned error can't find the container with id 4dcd2015bebb476026905bd3ec9d3d863f682357e6da7d8fb4e9bd147b97de33 Apr 22 18:49:14.910185 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:14.910147 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jcxvp" event={"ID":"c7699e9a-8c09-4e2b-80d0-2164298c3efb","Type":"ContainerStarted","Data":"4dcd2015bebb476026905bd3ec9d3d863f682357e6da7d8fb4e9bd147b97de33"} Apr 22 18:49:14.914519 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:14.914477 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" event={"ID":"41cbca3c-525e-4447-aa48-6fec0572815b","Type":"ContainerStarted","Data":"ce0da42b38b4d3d324a28a345bcb6a429d05fc82e6f2cafa305b499c26fdb698"} Apr 22 18:49:14.914685 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:14.914529 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" event={"ID":"41cbca3c-525e-4447-aa48-6fec0572815b","Type":"ContainerStarted","Data":"52ffa35cc47b8da0b654c6a526597b31aebc2fb30b0743a6f3b563bc601d1194"} Apr 22 18:49:14.914685 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:14.914547 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" event={"ID":"41cbca3c-525e-4447-aa48-6fec0572815b","Type":"ContainerStarted","Data":"b3ad1c0fbf142672e38227765fe84e359eaa6c2c37c2fb1bc685b41a87a60a65"} Apr 22 18:49:14.914841 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:14.914808 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:14.952798 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:14.952740 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" podStartSLOduration=2.010427548 podStartE2EDuration="4.952715308s" podCreationTimestamp="2026-04-22 18:49:10 +0000 UTC" firstStartedPulling="2026-04-22 18:49:11.380081833 +0000 UTC m=+148.442566143" lastFinishedPulling="2026-04-22 18:49:14.322369591 +0000 UTC m=+151.384853903" observedRunningTime="2026-04-22 18:49:14.95066504 +0000 UTC m=+152.013149372" watchObservedRunningTime="2026-04-22 18:49:14.952715308 +0000 UTC m=+152.015199640" Apr 22 18:49:15.132962 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.132903 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:49:15.138892 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.138863 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.142113 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.142085 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 18:49:15.142282 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.142154 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 18:49:15.142344 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.142099 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 18:49:15.142344 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.142337 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-fxthp\"" Apr 22 18:49:15.142623 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.142593 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 18:49:15.142719 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.142593 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 18:49:15.143355 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.143332 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 18:49:15.143594 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.143574 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 18:49:15.143854 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.143832 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 18:49:15.144361 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.144308 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 18:49:15.144690 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.144669 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 18:49:15.144767 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.144747 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 18:49:15.145293 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.144934 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 18:49:15.145769 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.145752 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-7vugm3i6c7sme\"" Apr 22 18:49:15.146511 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.146492 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 18:49:15.151143 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.151094 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:49:15.231333 
ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.231247 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdqqb\" (UniqueName: \"kubernetes.io/projected/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-kube-api-access-jdqqb\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.231333 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.231292 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.231333 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.231311 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.231333 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.231331 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.231645 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.231353 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.231645 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.231372 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.231645 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.231417 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.231645 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.231447 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-web-config\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.231645 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.231488 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.231645 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.231518 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-config\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.231645 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.231568 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.231645 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.231618 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.231645 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.231648 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.231967 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.231683 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.231967 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.231709 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-config-out\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.231967 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.231741 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.231967 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.231780 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.231967 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.231811 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.332588 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.332547 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.332588 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.332595 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-config-out\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.332802 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.332783 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.332859 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.332842 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.332901 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.332888 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.332955 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.332940 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdqqb\" (UniqueName: \"kubernetes.io/projected/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-kube-api-access-jdqqb\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.333002 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.332985 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.333035 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.333014 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.333096 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.333055 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.333096 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.333090 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.333195 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.333125 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.333195 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.333150 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.333195 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.333175 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-web-config\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.333355 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.333246 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.333355 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.333301 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-config\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.333355 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.333340 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" 
(UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.333499 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.333365 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.333499 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.333410 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.337905 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.334145 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.337905 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.334326 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.337905 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.334552 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.337905 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.337061 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.337905 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.337337 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.337905 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.337568 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.338886 ip-10-0-134-109 
kubenswrapper[2575]: I0422 18:49:15.337915 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.338886 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.338659 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-config\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.339664 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.339637 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.341088 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.340566 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.341088 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.340636 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-web-config\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.341088 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.341043 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.341329 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.341230 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-config-out\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.341707 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.341682 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.342036 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.341994 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.342767 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.342746 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.343622 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.343596 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.348071 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.348005 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdqqb\" (UniqueName: \"kubernetes.io/projected/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-kube-api-access-jdqqb\") pod \"prometheus-k8s-0\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.456070 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.456031 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:15.743012 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.742985 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:49:15.747310 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:49:15.747282 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda73f90a8_4fcc_4bf4_84c0_c52d6ebcee02.slice/crio-a0a7256bca79617d0a8d82e7a3e6fd5ab766056a577afc0e01edac65898ef224 WatchSource:0}: Error finding container a0a7256bca79617d0a8d82e7a3e6fd5ab766056a577afc0e01edac65898ef224: Status 404 returned error can't find the container with id a0a7256bca79617d0a8d82e7a3e6fd5ab766056a577afc0e01edac65898ef224 Apr 22 18:49:15.918409 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.918318 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jcxvp" event={"ID":"c7699e9a-8c09-4e2b-80d0-2164298c3efb","Type":"ContainerStarted","Data":"5615605c28c3a04f331fe0129a06d60babf9a9918bf6d6f1e168569edb30186b"} Apr 22 18:49:15.918821 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.918500 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jcxvp" Apr 22 18:49:15.919557 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.919530 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02","Type":"ContainerStarted","Data":"a0a7256bca79617d0a8d82e7a3e6fd5ab766056a577afc0e01edac65898ef224"} Apr 22 18:49:15.923270 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.923248 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jcxvp" Apr 22 18:49:15.934755 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:15.934713 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jcxvp" podStartSLOduration=1.548711629 podStartE2EDuration="2.934699148s" podCreationTimestamp="2026-04-22 18:49:13 +0000 UTC" firstStartedPulling="2026-04-22 18:49:14.280374379 +0000 UTC m=+151.342858688" lastFinishedPulling="2026-04-22 18:49:15.666361886 +0000 UTC m=+152.728846207" observedRunningTime="2026-04-22 18:49:15.933876331 +0000 UTC m=+152.996360663" watchObservedRunningTime="2026-04-22 18:49:15.934699148 +0000 UTC m=+152.997183479" Apr 22 18:49:16.924344 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:16.924257 2575 generic.go:358] "Generic (PLEG): container finished" podID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerID="15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527" exitCode=0 Apr 22 18:49:16.924703 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:16.924350 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02","Type":"ContainerDied","Data":"15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527"} Apr 22 18:49:19.841463 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:49:19.841414 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-5xxjt" podUID="a708e3d6-d406-4cfd-ab5f-8dd221a9fd88" Apr 22 18:49:19.845767 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:49:19.845735 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-lv9p4" podUID="a39d7311-24fb-45df-884c-194983c0905a" Apr 22 18:49:19.945174 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:19.945134 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02","Type":"ContainerStarted","Data":"1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885"} Apr 22 18:49:19.945314 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:19.945188 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02","Type":"ContainerStarted","Data":"0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d"} Apr 22 18:49:19.945314 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:19.945190 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lv9p4" Apr 22 18:49:19.945459 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:19.945419 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-5xxjt" Apr 22 18:49:20.926552 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:20.926525 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-545fb8c4d5-cbqvr" Apr 22 18:49:20.950965 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:20.950930 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02","Type":"ContainerStarted","Data":"9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce"} Apr 22 18:49:20.950965 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:20.950968 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02","Type":"ContainerStarted","Data":"2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952"} Apr 22 18:49:20.951184 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:20.950980 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02","Type":"ContainerStarted","Data":"e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273"} Apr 22 18:49:20.951184 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:20.950989 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02","Type":"ContainerStarted","Data":"cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc"} Apr 22 18:49:20.976527 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:20.976465 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.012696688 podStartE2EDuration="5.976448566s" podCreationTimestamp="2026-04-22 18:49:15 +0000 UTC" firstStartedPulling="2026-04-22 18:49:15.749657827 +0000 UTC m=+152.812142141" lastFinishedPulling="2026-04-22 18:49:19.713409705 +0000 UTC m=+156.775894019" observedRunningTime="2026-04-22 18:49:20.975052248 +0000 UTC m=+158.037536614" watchObservedRunningTime="2026-04-22 18:49:20.976448566 +0000 UTC m=+158.038932897" Apr 22 18:49:22.024177 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:22.024128 2575 patch_prober.go:28] interesting pod/image-registry-794b4685b8-4n2n9 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:49:22.024616 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:22.024202 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" podUID="b3d6b355-22f1-4db5-a332-667de73ccfa1" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:49:24.723181 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:24.723136 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert\") pod \"ingress-canary-lv9p4\" (UID: \"a39d7311-24fb-45df-884c-194983c0905a\") " pod="openshift-ingress-canary/ingress-canary-lv9p4" Apr 22 18:49:24.723181 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:24.723185 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls\") pod \"dns-default-5xxjt\" (UID: \"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88\") " pod="openshift-dns/dns-default-5xxjt" Apr 22 18:49:24.725540 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:24.725511 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a39d7311-24fb-45df-884c-194983c0905a-cert\") pod \"ingress-canary-lv9p4\" (UID: \"a39d7311-24fb-45df-884c-194983c0905a\") " pod="openshift-ingress-canary/ingress-canary-lv9p4" Apr 22 18:49:24.725815 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:24.725798 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a708e3d6-d406-4cfd-ab5f-8dd221a9fd88-metrics-tls\") pod \"dns-default-5xxjt\" (UID: \"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88\") " pod="openshift-dns/dns-default-5xxjt" Apr 22 18:49:24.748597 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:24.748570 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ctxn4\"" Apr 22 18:49:24.749671 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:24.749654 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tdfh7\"" Apr 22 18:49:24.756718 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:24.756698 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5xxjt" Apr 22 18:49:24.756787 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:24.756753 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lv9p4" Apr 22 18:49:24.906389 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:24.906363 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lv9p4"] Apr 22 18:49:24.909252 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:49:24.909202 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda39d7311_24fb_45df_884c_194983c0905a.slice/crio-42b04b2eb698b7226e0b4bd95984cba37566a4b20543ddaf503476809d83da57 WatchSource:0}: Error finding container 42b04b2eb698b7226e0b4bd95984cba37566a4b20543ddaf503476809d83da57: Status 404 returned error can't find the container with id 42b04b2eb698b7226e0b4bd95984cba37566a4b20543ddaf503476809d83da57 Apr 22 18:49:24.925650 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:24.925624 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5xxjt"] Apr 22 18:49:24.928386 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:49:24.928360 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda708e3d6_d406_4cfd_ab5f_8dd221a9fd88.slice/crio-6edf55df7940851ba40b21b883a2208d7ac544640f5ed1f252a59a016ef8e7ab WatchSource:0}: Error finding container 6edf55df7940851ba40b21b883a2208d7ac544640f5ed1f252a59a016ef8e7ab: Status 404 returned error can't find the container with id 6edf55df7940851ba40b21b883a2208d7ac544640f5ed1f252a59a016ef8e7ab Apr 22 18:49:24.966446 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:24.966403 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lv9p4" 
event={"ID":"a39d7311-24fb-45df-884c-194983c0905a","Type":"ContainerStarted","Data":"42b04b2eb698b7226e0b4bd95984cba37566a4b20543ddaf503476809d83da57"} Apr 22 18:49:24.967478 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:24.967452 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5xxjt" event={"ID":"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88","Type":"ContainerStarted","Data":"6edf55df7940851ba40b21b883a2208d7ac544640f5ed1f252a59a016ef8e7ab"} Apr 22 18:49:25.457132 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:25.457096 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:49:26.978566 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:26.978533 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lv9p4" event={"ID":"a39d7311-24fb-45df-884c-194983c0905a","Type":"ContainerStarted","Data":"3987092d1c8c422b54214c17868cd93a2c9b09397ae1a852ae7f8dd05e4242e2"} Apr 22 18:49:26.980233 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:26.980192 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5xxjt" event={"ID":"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88","Type":"ContainerStarted","Data":"8d307489819466561d41809a0b82c1edca6755a39cf7b2f0f4335e131f651cf6"} Apr 22 18:49:26.997376 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:26.997261 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-lv9p4" podStartSLOduration=129.158128839 podStartE2EDuration="2m10.997241944s" podCreationTimestamp="2026-04-22 18:47:16 +0000 UTC" firstStartedPulling="2026-04-22 18:49:24.91118669 +0000 UTC m=+161.973670999" lastFinishedPulling="2026-04-22 18:49:26.75029978 +0000 UTC m=+163.812784104" observedRunningTime="2026-04-22 18:49:26.99525032 +0000 UTC m=+164.057734651" watchObservedRunningTime="2026-04-22 18:49:26.997241944 +0000 UTC m=+164.059726276" Apr 22 18:49:27.037983 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.037937 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" podUID="b3d6b355-22f1-4db5-a332-667de73ccfa1" containerName="registry" containerID="cri-o://233dd5513f465c5e096c16357316ee8b701bebb8af59ff853558e9dbcbd1d6d6" gracePeriod=30 Apr 22 18:49:27.276853 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.276828 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:49:27.445117 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.445064 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-tls\") pod \"b3d6b355-22f1-4db5-a332-667de73ccfa1\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " Apr 22 18:49:27.445117 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.445121 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrmps\" (UniqueName: \"kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-kube-api-access-jrmps\") pod \"b3d6b355-22f1-4db5-a332-667de73ccfa1\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " Apr 22 18:49:27.445421 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.445152 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3d6b355-22f1-4db5-a332-667de73ccfa1-trusted-ca\") pod \"b3d6b355-22f1-4db5-a332-667de73ccfa1\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " Apr 22 18:49:27.445421 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.445172 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b3d6b355-22f1-4db5-a332-667de73ccfa1-image-registry-private-configuration\") pod \"b3d6b355-22f1-4db5-a332-667de73ccfa1\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " Apr 22 18:49:27.445421 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.445197 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-certificates\") pod \"b3d6b355-22f1-4db5-a332-667de73ccfa1\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " Apr 22 18:49:27.445421 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.445260 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b3d6b355-22f1-4db5-a332-667de73ccfa1-ca-trust-extracted\") pod \"b3d6b355-22f1-4db5-a332-667de73ccfa1\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " Apr 22 18:49:27.445421 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.445285 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-bound-sa-token\") pod \"b3d6b355-22f1-4db5-a332-667de73ccfa1\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " Apr 22 18:49:27.445421 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.445343 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b3d6b355-22f1-4db5-a332-667de73ccfa1-installation-pull-secrets\") pod \"b3d6b355-22f1-4db5-a332-667de73ccfa1\" (UID: \"b3d6b355-22f1-4db5-a332-667de73ccfa1\") " Apr 22 18:49:27.445717 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.445693 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3d6b355-22f1-4db5-a332-667de73ccfa1-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b3d6b355-22f1-4db5-a332-667de73ccfa1" (UID: "b3d6b355-22f1-4db5-a332-667de73ccfa1"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:49:27.445769 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.445744 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b3d6b355-22f1-4db5-a332-667de73ccfa1" (UID: "b3d6b355-22f1-4db5-a332-667de73ccfa1"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:49:27.447769 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.447744 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3d6b355-22f1-4db5-a332-667de73ccfa1-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "b3d6b355-22f1-4db5-a332-667de73ccfa1" (UID: "b3d6b355-22f1-4db5-a332-667de73ccfa1"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:49:27.447922 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.447891 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b3d6b355-22f1-4db5-a332-667de73ccfa1" (UID: "b3d6b355-22f1-4db5-a332-667de73ccfa1"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:49:27.447983 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.447905 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b3d6b355-22f1-4db5-a332-667de73ccfa1" (UID: "b3d6b355-22f1-4db5-a332-667de73ccfa1"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:49:27.447983 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.447961 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-kube-api-access-jrmps" (OuterVolumeSpecName: "kube-api-access-jrmps") pod "b3d6b355-22f1-4db5-a332-667de73ccfa1" (UID: "b3d6b355-22f1-4db5-a332-667de73ccfa1"). InnerVolumeSpecName "kube-api-access-jrmps". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:49:27.448379 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.448365 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3d6b355-22f1-4db5-a332-667de73ccfa1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b3d6b355-22f1-4db5-a332-667de73ccfa1" (UID: "b3d6b355-22f1-4db5-a332-667de73ccfa1"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:49:27.453380 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.453355 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d6b355-22f1-4db5-a332-667de73ccfa1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b3d6b355-22f1-4db5-a332-667de73ccfa1" (UID: "b3d6b355-22f1-4db5-a332-667de73ccfa1"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:49:27.546714 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.546682 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b3d6b355-22f1-4db5-a332-667de73ccfa1-trusted-ca\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:49:27.546714 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.546713 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b3d6b355-22f1-4db5-a332-667de73ccfa1-image-registry-private-configuration\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:49:27.546914 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.546723 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-certificates\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:49:27.546914 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.546734 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b3d6b355-22f1-4db5-a332-667de73ccfa1-ca-trust-extracted\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:49:27.546914 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.546743 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-bound-sa-token\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:49:27.546914 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.546753 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b3d6b355-22f1-4db5-a332-667de73ccfa1-installation-pull-secrets\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:49:27.546914 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.546762 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-registry-tls\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:49:27.546914 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.546770 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jrmps\" (UniqueName: \"kubernetes.io/projected/b3d6b355-22f1-4db5-a332-667de73ccfa1-kube-api-access-jrmps\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:49:27.984646 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.984558 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5xxjt" event={"ID":"a708e3d6-d406-4cfd-ab5f-8dd221a9fd88","Type":"ContainerStarted","Data":"027678ada0e74b263970e9e6bc6913031fcd6fe22c2efdc980ffc44fb7cd9848"} Apr 22 18:49:27.985063 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.984653 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-5xxjt" Apr 22 18:49:27.985781 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.985757 2575 generic.go:358] "Generic (PLEG): container finished" podID="b3d6b355-22f1-4db5-a332-667de73ccfa1" containerID="233dd5513f465c5e096c16357316ee8b701bebb8af59ff853558e9dbcbd1d6d6" exitCode=0 Apr 22 18:49:27.985834 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.985803 2575 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" Apr 22 18:49:27.985869 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.985840 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" event={"ID":"b3d6b355-22f1-4db5-a332-667de73ccfa1","Type":"ContainerDied","Data":"233dd5513f465c5e096c16357316ee8b701bebb8af59ff853558e9dbcbd1d6d6"} Apr 22 18:49:27.985947 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.985866 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-794b4685b8-4n2n9" event={"ID":"b3d6b355-22f1-4db5-a332-667de73ccfa1","Type":"ContainerDied","Data":"aea67e27e267fe66c378ae700848707c3783f41ed895bf309395f24783bc7970"} Apr 22 18:49:27.985947 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.985884 2575 scope.go:117] "RemoveContainer" containerID="233dd5513f465c5e096c16357316ee8b701bebb8af59ff853558e9dbcbd1d6d6" Apr 22 18:49:27.993921 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.993902 2575 scope.go:117] "RemoveContainer" containerID="233dd5513f465c5e096c16357316ee8b701bebb8af59ff853558e9dbcbd1d6d6" Apr 22 18:49:27.994189 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:49:27.994166 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"233dd5513f465c5e096c16357316ee8b701bebb8af59ff853558e9dbcbd1d6d6\": container with ID starting with 233dd5513f465c5e096c16357316ee8b701bebb8af59ff853558e9dbcbd1d6d6 not found: ID does not exist" containerID="233dd5513f465c5e096c16357316ee8b701bebb8af59ff853558e9dbcbd1d6d6" Apr 22 18:49:27.994275 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:27.994197 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"233dd5513f465c5e096c16357316ee8b701bebb8af59ff853558e9dbcbd1d6d6"} err="failed to get container status \"233dd5513f465c5e096c16357316ee8b701bebb8af59ff853558e9dbcbd1d6d6\": rpc error: code = NotFound desc = could not find container \"233dd5513f465c5e096c16357316ee8b701bebb8af59ff853558e9dbcbd1d6d6\": container with ID starting with 233dd5513f465c5e096c16357316ee8b701bebb8af59ff853558e9dbcbd1d6d6 not found: ID does not exist" Apr 22 18:49:28.002116 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:28.002069 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5xxjt" podStartSLOduration=130.185308093 podStartE2EDuration="2m12.002057336s" podCreationTimestamp="2026-04-22 18:47:16 +0000 UTC" firstStartedPulling="2026-04-22 18:49:24.930177853 +0000 UTC m=+161.992662162" lastFinishedPulling="2026-04-22 18:49:26.74692709 +0000 UTC m=+163.809411405" observedRunningTime="2026-04-22 18:49:28.001014861 +0000 UTC m=+165.063499192" watchObservedRunningTime="2026-04-22 18:49:28.002057336 +0000 UTC m=+165.064541649" Apr 22 18:49:28.028705 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:28.028671 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-794b4685b8-4n2n9"] Apr 22 18:49:28.032459 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:28.032432 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-794b4685b8-4n2n9"] Apr 22 18:49:29.474749 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:29.474706 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3d6b355-22f1-4db5-a332-667de73ccfa1" 
path="/var/lib/kubelet/pods/b3d6b355-22f1-4db5-a332-667de73ccfa1/volumes" Apr 22 18:49:37.992891 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:37.992812 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5xxjt" Apr 22 18:49:47.041915 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:47.041880 2575 generic.go:358] "Generic (PLEG): container finished" podID="b9d9c3b2-8e29-4384-acf5-23ae2b9445a0" containerID="3c7571c11f8d5f85cbdcef160cf29ba841d00f2ac5d4c093e776c1e8cace3bb5" exitCode=0 Apr 22 18:49:47.042328 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:47.041957 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vnbdg" event={"ID":"b9d9c3b2-8e29-4384-acf5-23ae2b9445a0","Type":"ContainerDied","Data":"3c7571c11f8d5f85cbdcef160cf29ba841d00f2ac5d4c093e776c1e8cace3bb5"} Apr 22 18:49:47.042379 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:47.042329 2575 scope.go:117] "RemoveContainer" containerID="3c7571c11f8d5f85cbdcef160cf29ba841d00f2ac5d4c093e776c1e8cace3bb5" Apr 22 18:49:48.048778 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:48.048743 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-vnbdg" event={"ID":"b9d9c3b2-8e29-4384-acf5-23ae2b9445a0","Type":"ContainerStarted","Data":"9d98380d7a264896b414e0c71ba8eb74c506b41387fece99cec263c7672eec7c"} Apr 22 18:49:56.072753 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:56.072716 2575 generic.go:358] "Generic (PLEG): container finished" podID="c13de4f5-3c84-460a-9d6a-f2c326e0eef9" containerID="702463e12cfbef9707673622a335433d02b094f70e7b9f76e169dcc6a1b0b287" exitCode=0 Apr 22 18:49:56.073157 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:56.072786 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-wprxw" event={"ID":"c13de4f5-3c84-460a-9d6a-f2c326e0eef9","Type":"ContainerDied","Data":"702463e12cfbef9707673622a335433d02b094f70e7b9f76e169dcc6a1b0b287"} Apr 22 18:49:56.073157 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:56.073135 2575 scope.go:117] "RemoveContainer" containerID="702463e12cfbef9707673622a335433d02b094f70e7b9f76e169dcc6a1b0b287" Apr 22 18:49:57.081273 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:49:57.077766 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-wprxw" event={"ID":"c13de4f5-3c84-460a-9d6a-f2c326e0eef9","Type":"ContainerStarted","Data":"1dd3b676300ca7a272370863ed3ada5bd42becf416aa69e221853533ba5a1445"} Apr 22 18:50:15.457345 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:15.457300 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:15.476892 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:15.476865 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:16.155400 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:16.155369 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:33.476709 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.476675 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:50:33.477396 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.477264 2575 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="kube-rbac-proxy" containerID="cri-o://2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952" gracePeriod=600 Apr 22 18:50:33.477396 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.477315 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="thanos-sidecar" containerID="cri-o://cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc" gracePeriod=600 Apr 22 18:50:33.477396 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.477332 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="config-reloader" containerID="cri-o://1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885" gracePeriod=600 Apr 22 18:50:33.477396 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.477339 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="kube-rbac-proxy-web" containerID="cri-o://e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273" gracePeriod=600 Apr 22 18:50:33.477396 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.477242 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="prometheus" containerID="cri-o://0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d" gracePeriod=600 Apr 22 18:50:33.477396 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.477312 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="kube-rbac-proxy-thanos" containerID="cri-o://9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce" gracePeriod=600 Apr 22 18:50:33.731810 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.731740 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:33.816565 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.816526 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-configmap-kubelet-serving-ca-bundle\") pod \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " Apr 22 18:50:33.816748 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.816607 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-prometheus-trusted-ca-bundle\") pod \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " Apr 22 18:50:33.816748 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.816632 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-prometheus-k8s-rulefiles-0\") pod \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " Apr 22 18:50:33.816748 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.816675 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-tls-assets\") pod \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " Apr 22 18:50:33.816748 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.816699 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-web-config\") pod \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " Apr 22 18:50:33.816972 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.816836 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-kube-rbac-proxy\") pod \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " Apr 22 18:50:33.816972 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.816880 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-thanos-prometheus-http-client-file\") pod \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " Apr 22 18:50:33.816972 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.816915 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-configmap-serving-certs-ca-bundle\") pod \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " Apr 22 18:50:33.816972 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.816944 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-prometheus-k8s-kube-rbac-proxy-web\") pod 
\"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " Apr 22 18:50:33.816972 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.816971 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-prometheus-k8s-tls\") pod \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " Apr 22 18:50:33.817246 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.816995 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-prometheus-k8s-db\") pod \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " Apr 22 18:50:33.817246 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.816992 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" (UID: "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:33.817246 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.817023 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-configmap-metrics-client-ca\") pod \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " Apr 22 18:50:33.817246 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.817050 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" (UID: "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:33.817246 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.817060 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-metrics-client-certs\") pod \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " Apr 22 18:50:33.817246 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.817120 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-config-out\") pod \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " Apr 22 18:50:33.817246 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.817159 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " Apr 22 18:50:33.817246 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.817198 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-config\") pod \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " Apr 22 18:50:33.817627 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.817254 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-grpc-tls\") pod \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " Apr 22 18:50:33.817627 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.817300 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdqqb\" (UniqueName: \"kubernetes.io/projected/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-kube-api-access-jdqqb\") pod \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\" (UID: \"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02\") " Apr 22 18:50:33.817627 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.817557 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:50:33.817627 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.817575 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-prometheus-trusted-ca-bundle\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:50:33.817826 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.817799 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" (UID: "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:33.818230 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.817949 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" (UID: "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:33.818863 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.818807 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" (UID: "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:50:33.819021 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.818991 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" (UID: "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:50:33.819721 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.819693 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" (UID: "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:33.819821 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.819734 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" (UID: "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:33.820125 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.820087 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" (UID: "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:33.820537 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.820497 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-kube-api-access-jdqqb" (OuterVolumeSpecName: "kube-api-access-jdqqb") pod "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" (UID: "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02"). InnerVolumeSpecName "kube-api-access-jdqqb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:50:33.820733 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.820704 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" (UID: "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:50:33.821301 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.821270 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" (UID: "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:33.821754 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.821718 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" (UID: "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:33.821754 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.821738 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" (UID: "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:33.821864 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.821759 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" (UID: "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:33.822102 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.822087 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-config" (OuterVolumeSpecName: "config") pod "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" (UID: "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:33.822498 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.822483 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-config-out" (OuterVolumeSpecName: "config-out") pod "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" (UID: "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:50:33.832263 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.832230 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-web-config" (OuterVolumeSpecName: "web-config") pod "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" (UID: "a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:50:33.918725 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.918696 2575 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-kube-rbac-proxy\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:50:33.918725 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.918721 2575 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-thanos-prometheus-http-client-file\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:50:33.918725 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.918732 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:50:33.918964 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.918741 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:50:33.918964 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.918751 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-prometheus-k8s-tls\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:50:33.918964 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.918761 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-prometheus-k8s-db\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:50:33.918964 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.918770 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-configmap-metrics-client-ca\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:50:33.918964 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.918778 2575 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-metrics-client-certs\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:50:33.918964 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.918787 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-config-out\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:50:33.918964 ip-10-0-134-109 
kubenswrapper[2575]: I0422 18:50:33.918796 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:50:33.918964 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.918805 2575 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-config\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:50:33.918964 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.918815 2575 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-secret-grpc-tls\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:50:33.918964 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.918823 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jdqqb\" (UniqueName: \"kubernetes.io/projected/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-kube-api-access-jdqqb\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:50:33.918964 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.918831 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:50:33.918964 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.918841 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-tls-assets\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:50:33.918964 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:33.918851 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02-web-config\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:50:34.199402 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.199369 2575 generic.go:358] "Generic (PLEG): container finished" podID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerID="9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce" exitCode=0 Apr 22 18:50:34.199402 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.199395 2575 generic.go:358] "Generic (PLEG): container finished" podID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerID="2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952" exitCode=0 Apr 22 18:50:34.199402 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.199401 2575 generic.go:358] "Generic (PLEG): container finished" podID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerID="e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273" exitCode=0 Apr 22 18:50:34.199402 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.199407 2575 generic.go:358] "Generic (PLEG): container finished" podID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerID="cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc" exitCode=0 Apr 22 18:50:34.199402 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.199411 2575 generic.go:358] "Generic (PLEG): container finished" podID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" 
containerID="1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885" exitCode=0 Apr 22 18:50:34.199402 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.199416 2575 generic.go:358] "Generic (PLEG): container finished" podID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerID="0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d" exitCode=0 Apr 22 18:50:34.199909 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.199454 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02","Type":"ContainerDied","Data":"9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce"} Apr 22 18:50:34.199909 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.199493 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02","Type":"ContainerDied","Data":"2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952"} Apr 22 18:50:34.199909 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.199509 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.199909 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.199524 2575 scope.go:117] "RemoveContainer" containerID="9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce" Apr 22 18:50:34.199909 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.199513 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02","Type":"ContainerDied","Data":"e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273"} Apr 22 18:50:34.199909 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.199628 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02","Type":"ContainerDied","Data":"cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc"} Apr 22 18:50:34.199909 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.199642 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02","Type":"ContainerDied","Data":"1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885"} Apr 22 18:50:34.199909 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.199652 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02","Type":"ContainerDied","Data":"0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d"} Apr 22 18:50:34.199909 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.199663 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02","Type":"ContainerDied","Data":"a0a7256bca79617d0a8d82e7a3e6fd5ab766056a577afc0e01edac65898ef224"} Apr 22 18:50:34.207505 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.207407 2575 scope.go:117] "RemoveContainer" containerID="2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952" Apr 22 18:50:34.214710 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.214689 2575 scope.go:117] "RemoveContainer" containerID="e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273" Apr 22 18:50:34.221529 ip-10-0-134-109 kubenswrapper[2575]: I0422 
18:50:34.221499 2575 scope.go:117] "RemoveContainer" containerID="cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc" Apr 22 18:50:34.223051 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.223030 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:50:34.227185 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.227161 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:50:34.229348 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.229328 2575 scope.go:117] "RemoveContainer" containerID="1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885" Apr 22 18:50:34.235972 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.235954 2575 scope.go:117] "RemoveContainer" containerID="0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d" Apr 22 18:50:34.242905 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.242886 2575 scope.go:117] "RemoveContainer" containerID="15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527" Apr 22 18:50:34.249563 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.249535 2575 scope.go:117] "RemoveContainer" containerID="9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce" Apr 22 18:50:34.249825 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:50:34.249805 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce\": container with ID starting with 9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce not found: ID does not exist" containerID="9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce" Apr 22 18:50:34.249872 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.249835 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce"} err="failed to get container status \"9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce\": rpc error: code = NotFound desc = could not find container \"9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce\": container with ID starting with 9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce not found: ID does not exist" Apr 22 18:50:34.249872 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.249853 2575 scope.go:117] "RemoveContainer" containerID="2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952" Apr 22 18:50:34.250093 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:50:34.250075 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952\": container with ID starting with 2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952 not found: ID does not exist" containerID="2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952" Apr 22 18:50:34.250132 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.250108 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952"} err="failed to get container status \"2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952\": rpc error: code = NotFound desc = could not find container \"2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952\": 
container with ID starting with 2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952 not found: ID does not exist" Apr 22 18:50:34.250132 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.250125 2575 scope.go:117] "RemoveContainer" containerID="e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273" Apr 22 18:50:34.250409 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:50:34.250392 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273\": container with ID starting with e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273 not found: ID does not exist" containerID="e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273" Apr 22 18:50:34.250478 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.250413 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273"} err="failed to get container status \"e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273\": rpc error: code = NotFound desc = could not find container \"e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273\": container with ID starting with e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273 not found: ID does not exist" Apr 22 18:50:34.250478 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.250427 2575 scope.go:117] "RemoveContainer" containerID="cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc" Apr 22 18:50:34.250671 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:50:34.250652 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc\": container with ID starting with cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc not found: ID does not exist" containerID="cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc" Apr 22 18:50:34.250714 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.250676 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc"} err="failed to get container status \"cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc\": rpc error: code = NotFound desc = could not find container \"cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc\": container with ID starting with cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc not found: ID does not exist" Apr 22 18:50:34.250714 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.250692 2575 scope.go:117] "RemoveContainer" containerID="1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885" Apr 22 18:50:34.250912 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:50:34.250896 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885\": container with ID starting with 1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885 not found: ID does not exist" containerID="1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885" Apr 22 18:50:34.250950 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.250916 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885"} err="failed to get container status \"1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885\": rpc error: code = NotFound desc = could not find container \"1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885\": container with ID starting with 1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885 not found: ID does not exist" Apr 22 18:50:34.250950 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.250931 2575 scope.go:117] "RemoveContainer" containerID="0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d" Apr 22 18:50:34.251146 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:50:34.251126 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d\": container with ID starting with 0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d not found: ID does not exist" containerID="0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d" Apr 22 18:50:34.251196 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.251157 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d"} err="failed to get container status \"0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d\": rpc error: code = NotFound desc = could not find container \"0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d\": container with ID starting with 0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d not found: ID does not exist" Apr 22 18:50:34.251196 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.251170 2575 scope.go:117] "RemoveContainer" containerID="15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527" Apr 22 18:50:34.251421 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:50:34.251402 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527\": container with ID starting with 15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527 not found: ID does not exist" containerID="15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527" Apr 22 18:50:34.251461 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.251424 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527"} err="failed to get container status \"15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527\": rpc error: code = NotFound desc = could not find container \"15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527\": container with ID starting with 15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527 not found: ID does not exist" Apr 22 18:50:34.251461 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.251437 2575 scope.go:117] "RemoveContainer" containerID="9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce" Apr 22 18:50:34.251644 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.251626 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce"} err="failed to get container status 
\"9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce\": rpc error: code = NotFound desc = could not find container \"9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce\": container with ID starting with 9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce not found: ID does not exist" Apr 22 18:50:34.251702 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.251645 2575 scope.go:117] "RemoveContainer" containerID="2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952" Apr 22 18:50:34.251942 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.251871 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952"} err="failed to get container status \"2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952\": rpc error: code = NotFound desc = could not find container \"2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952\": container with ID starting with 2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952 not found: ID does not exist" Apr 22 18:50:34.251942 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.251900 2575 scope.go:117] "RemoveContainer" containerID="e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273" Apr 22 18:50:34.252361 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.252155 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273"} err="failed to get container status \"e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273\": rpc error: code = NotFound desc = could not find container \"e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273\": container with ID starting with e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273 not found: ID does not exist" Apr 22 18:50:34.252361 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.252182 2575 scope.go:117] "RemoveContainer" containerID="cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc" Apr 22 18:50:34.252506 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.252466 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc"} err="failed to get container status \"cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc\": rpc error: code = NotFound desc = could not find container \"cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc\": container with ID starting with cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc not found: ID does not exist" Apr 22 18:50:34.252506 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.252490 2575 scope.go:117] "RemoveContainer" containerID="1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885" Apr 22 18:50:34.252814 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.252789 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885"} err="failed to get container status \"1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885\": rpc error: code = NotFound desc = could not find container \"1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885\": container with ID starting with 1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885 not found: ID does 
not exist" Apr 22 18:50:34.252814 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.252813 2575 scope.go:117] "RemoveContainer" containerID="0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d" Apr 22 18:50:34.253098 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.253075 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d"} err="failed to get container status \"0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d\": rpc error: code = NotFound desc = could not find container \"0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d\": container with ID starting with 0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d not found: ID does not exist" Apr 22 18:50:34.253189 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.253102 2575 scope.go:117] "RemoveContainer" containerID="15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527" Apr 22 18:50:34.253393 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.253369 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527"} err="failed to get container status \"15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527\": rpc error: code = NotFound desc = could not find container \"15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527\": container with ID starting with 15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527 not found: ID does not exist" Apr 22 18:50:34.253440 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.253395 2575 scope.go:117] "RemoveContainer" containerID="9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce" Apr 22 18:50:34.253669 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.253645 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce"} err="failed to get container status \"9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce\": rpc error: code = NotFound desc = could not find container \"9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce\": container with ID starting with 9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce not found: ID does not exist" Apr 22 18:50:34.253742 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.253671 2575 scope.go:117] "RemoveContainer" containerID="2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952" Apr 22 18:50:34.253926 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.253908 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952"} err="failed to get container status \"2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952\": rpc error: code = NotFound desc = could not find container \"2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952\": container with ID starting with 2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952 not found: ID does not exist" Apr 22 18:50:34.253989 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.253927 2575 scope.go:117] "RemoveContainer" containerID="e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273" Apr 22 18:50:34.254043 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254006 2575 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:50:34.254165 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254148 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273"} err="failed to get container status \"e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273\": rpc error: code = NotFound desc = could not find container \"e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273\": container with ID starting with e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273 not found: ID does not exist" Apr 22 18:50:34.254266 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254165 2575 scope.go:117] "RemoveContainer" containerID="cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc" Apr 22 18:50:34.254434 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254408 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="kube-rbac-proxy-web" Apr 22 18:50:34.254434 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254429 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="kube-rbac-proxy-web" Apr 22 18:50:34.254515 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254443 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="config-reloader" Apr 22 18:50:34.254515 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254449 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="config-reloader" Apr 22 18:50:34.254515 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254455 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="kube-rbac-proxy" Apr 22 18:50:34.254515 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254449 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc"} err="failed to get container status \"cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc\": rpc error: code = NotFound desc = could not find container \"cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc\": container with ID starting with cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc not found: ID does not exist" Apr 22 18:50:34.254515 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254471 2575 scope.go:117] "RemoveContainer" containerID="1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885" Apr 22 18:50:34.254515 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254460 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="kube-rbac-proxy" Apr 22 18:50:34.254717 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254518 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3d6b355-22f1-4db5-a332-667de73ccfa1" containerName="registry" Apr 22 18:50:34.254717 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254526 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d6b355-22f1-4db5-a332-667de73ccfa1" containerName="registry" Apr 22 18:50:34.254717 ip-10-0-134-109 kubenswrapper[2575]: I0422 
18:50:34.254534 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="kube-rbac-proxy-thanos" Apr 22 18:50:34.254717 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254540 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="kube-rbac-proxy-thanos" Apr 22 18:50:34.254717 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254547 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="thanos-sidecar" Apr 22 18:50:34.254717 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254553 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="thanos-sidecar" Apr 22 18:50:34.254717 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254567 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="prometheus" Apr 22 18:50:34.254717 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254575 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="prometheus" Apr 22 18:50:34.254717 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254606 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="init-config-reloader" Apr 22 18:50:34.254717 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254615 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="init-config-reloader" Apr 22 18:50:34.254717 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254696 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="kube-rbac-proxy-web" Apr 22 18:50:34.254717 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254707 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="kube-rbac-proxy-thanos" Apr 22 18:50:34.254717 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254716 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="kube-rbac-proxy" Apr 22 18:50:34.254717 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254715 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885"} err="failed to get container status \"1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885\": rpc error: code = NotFound desc = could not find container \"1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885\": container with ID starting with 1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885 not found: ID does not exist" Apr 22 18:50:34.255432 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254726 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b3d6b355-22f1-4db5-a332-667de73ccfa1" containerName="registry" Apr 22 18:50:34.255432 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254730 2575 scope.go:117] "RemoveContainer" containerID="0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d" Apr 22 18:50:34.255432 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254736 2575 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="config-reloader" Apr 22 18:50:34.255432 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254742 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="prometheus" Apr 22 18:50:34.255432 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254749 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" containerName="thanos-sidecar" Apr 22 18:50:34.255432 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254952 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d"} err="failed to get container status \"0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d\": rpc error: code = NotFound desc = could not find container \"0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d\": container with ID starting with 0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d not found: ID does not exist" Apr 22 18:50:34.255432 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.254976 2575 scope.go:117] "RemoveContainer" containerID="15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527" Apr 22 18:50:34.255790 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.255673 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527"} err="failed to get container status \"15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527\": rpc error: code = NotFound desc = could not find container \"15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527\": container with ID starting with 15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527 not found: ID does not exist" Apr 22 18:50:34.255790 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.255700 2575 scope.go:117] "RemoveContainer" containerID="9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce" Apr 22 18:50:34.256080 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.256045 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce"} err="failed to get container status \"9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce\": rpc error: code = NotFound desc = could not find container \"9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce\": container with ID starting with 9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce not found: ID does not exist" Apr 22 18:50:34.256080 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.256069 2575 scope.go:117] "RemoveContainer" containerID="2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952" Apr 22 18:50:34.256558 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.256528 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952"} err="failed to get container status \"2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952\": rpc error: code = NotFound desc = could not find container \"2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952\": container with ID starting with 
2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952 not found: ID does not exist" Apr 22 18:50:34.256665 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.256560 2575 scope.go:117] "RemoveContainer" containerID="e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273" Apr 22 18:50:34.262685 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.261055 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273"} err="failed to get container status \"e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273\": rpc error: code = NotFound desc = could not find container \"e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273\": container with ID starting with e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273 not found: ID does not exist" Apr 22 18:50:34.262685 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.261086 2575 scope.go:117] "RemoveContainer" containerID="cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc" Apr 22 18:50:34.262910 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.262890 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.263019 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.262934 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc"} err="failed to get container status \"cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc\": rpc error: code = NotFound desc = could not find container \"cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc\": container with ID starting with cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc not found: ID does not exist" Apr 22 18:50:34.263019 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.262956 2575 scope.go:117] "RemoveContainer" containerID="1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885" Apr 22 18:50:34.263325 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.263293 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885"} err="failed to get container status \"1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885\": rpc error: code = NotFound desc = could not find container \"1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885\": container with ID starting with 1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885 not found: ID does not exist" Apr 22 18:50:34.263422 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.263327 2575 scope.go:117] "RemoveContainer" containerID="0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d" Apr 22 18:50:34.263696 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.263674 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d"} err="failed to get container status \"0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d\": rpc error: code = NotFound desc = could not find container \"0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d\": container with ID starting with 0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d not found: ID does not exist" Apr 22 
18:50:34.263773 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.263698 2575 scope.go:117] "RemoveContainer" containerID="15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527" Apr 22 18:50:34.263947 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.263927 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527"} err="failed to get container status \"15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527\": rpc error: code = NotFound desc = could not find container \"15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527\": container with ID starting with 15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527 not found: ID does not exist" Apr 22 18:50:34.264002 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.263948 2575 scope.go:117] "RemoveContainer" containerID="9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce" Apr 22 18:50:34.264195 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.264178 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce"} err="failed to get container status \"9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce\": rpc error: code = NotFound desc = could not find container \"9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce\": container with ID starting with 9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce not found: ID does not exist" Apr 22 18:50:34.264273 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.264195 2575 scope.go:117] "RemoveContainer" containerID="2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952" Apr 22 18:50:34.264419 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.264395 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952"} err="failed to get container status \"2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952\": rpc error: code = NotFound desc = could not find container \"2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952\": container with ID starting with 2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952 not found: ID does not exist" Apr 22 18:50:34.264419 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.264419 2575 scope.go:117] "RemoveContainer" containerID="e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273" Apr 22 18:50:34.264692 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.264675 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273"} err="failed to get container status \"e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273\": rpc error: code = NotFound desc = could not find container \"e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273\": container with ID starting with e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273 not found: ID does not exist" Apr 22 18:50:34.264692 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.264691 2575 scope.go:117] "RemoveContainer" containerID="cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc" Apr 22 18:50:34.264907 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.264889 2575 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc"} err="failed to get container status \"cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc\": rpc error: code = NotFound desc = could not find container \"cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc\": container with ID starting with cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc not found: ID does not exist" Apr 22 18:50:34.264952 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.264906 2575 scope.go:117] "RemoveContainer" containerID="1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885" Apr 22 18:50:34.265142 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.265124 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885"} err="failed to get container status \"1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885\": rpc error: code = NotFound desc = could not find container \"1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885\": container with ID starting with 1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885 not found: ID does not exist" Apr 22 18:50:34.265198 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.265143 2575 scope.go:117] "RemoveContainer" containerID="0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d" Apr 22 18:50:34.265389 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.265368 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d"} err="failed to get container status \"0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d\": rpc error: code = NotFound desc = could not find container \"0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d\": container with ID starting with 0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d not found: ID does not exist" Apr 22 18:50:34.265470 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.265392 2575 scope.go:117] "RemoveContainer" containerID="15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527" Apr 22 18:50:34.265639 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.265617 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527"} err="failed to get container status \"15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527\": rpc error: code = NotFound desc = could not find container \"15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527\": container with ID starting with 15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527 not found: ID does not exist" Apr 22 18:50:34.265639 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.265639 2575 scope.go:117] "RemoveContainer" containerID="9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce" Apr 22 18:50:34.265915 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.265897 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 18:50:34.266019 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.265921 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 18:50:34.266019 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.265927 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 18:50:34.266019 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.265973 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 18:50:34.266253 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.266077 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 18:50:34.266253 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.266080 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 18:50:34.266253 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.265901 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce"} err="failed to get container status \"9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce\": rpc error: code = NotFound desc = could not find container \"9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce\": container with ID starting with 9f86f024d920959415a4226711af9a063e30f0dd6939cfff8c2c76103ee3f0ce not found: ID does not exist" Apr 22 18:50:34.266413 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.266259 2575 scope.go:117] "RemoveContainer" containerID="2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952" Apr 22 18:50:34.266413 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.266240 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-7vugm3i6c7sme\"" Apr 22 18:50:34.266514 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.266424 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 18:50:34.266565 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.266527 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 18:50:34.266565 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.266525 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952"} err="failed to get container status \"2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952\": rpc error: code = NotFound desc = could not find container \"2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952\": container with ID starting with 2dbbf6c4dd7cba1689cb4bf17b6100655712412f0567494225d4c03ba816b952 not found: ID does not exist" Apr 22 18:50:34.266565 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.266547 2575 scope.go:117] "RemoveContainer" containerID="e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273" Apr 22 18:50:34.266565 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.266554 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 18:50:34.266756 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.266566 2575 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 18:50:34.266756 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.266682 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-fxthp\"" Apr 22 18:50:34.266862 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.266793 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273"} err="failed to get container status \"e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273\": rpc error: code = NotFound desc = could not find container \"e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273\": container with ID starting with e2170a8a592e6bc063df0f56cdc6b7b21c71f4324dab8456255e2a136c319273 not found: ID does not exist" Apr 22 18:50:34.266862 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.266815 2575 scope.go:117] "RemoveContainer" containerID="cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc" Apr 22 18:50:34.266969 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.266875 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 18:50:34.267025 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.267001 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc"} err="failed to get container status \"cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc\": rpc error: code = NotFound desc = could not find container \"cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc\": container with ID starting with cb8051d8b8d2bedfa63a74f573950d0a9fb91c1622d370b917c345ae56fdacdc not found: ID does not exist" Apr 22 18:50:34.267025 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.267022 2575 scope.go:117] "RemoveContainer" containerID="1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885" Apr 22 18:50:34.267395 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.267361 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885"} err="failed to get container status \"1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885\": rpc error: code = NotFound desc = could not find container \"1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885\": container with ID starting with 1f70865aed9e0e138c595e1a173cee086967cb15bbbd984397b0d9881ba50885 not found: ID does not exist" Apr 22 18:50:34.267478 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.267395 2575 scope.go:117] "RemoveContainer" containerID="0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d" Apr 22 18:50:34.267819 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.267791 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d"} err="failed to get container status \"0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d\": rpc error: code = NotFound desc = could not find container \"0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d\": container with ID starting with 0cfb01f6ff2d87b02f42ebfdb85dadcbd47c0f061b4be6ed3680a80d33be123d not 
found: ID does not exist" Apr 22 18:50:34.267819 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.267819 2575 scope.go:117] "RemoveContainer" containerID="15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527" Apr 22 18:50:34.268300 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.268194 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527"} err="failed to get container status \"15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527\": rpc error: code = NotFound desc = could not find container \"15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527\": container with ID starting with 15770661c315a42406282c4d304f0979d09c9ed263ff289c78aa8d4995b0d527 not found: ID does not exist" Apr 22 18:50:34.271930 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.271874 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 18:50:34.272593 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.272572 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:50:34.273654 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.273632 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 18:50:34.322220 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.322180 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.322416 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.322260 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-config\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.322416 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.322286 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-config-out\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.322416 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.322302 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.322416 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.322323 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.322632 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.322408 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.322632 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.322451 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.322632 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.322486 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk9wd\" (UniqueName: \"kubernetes.io/projected/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-kube-api-access-wk9wd\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.322632 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.322518 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.322632 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.322559 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.322632 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.322609 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.322895 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.322667 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.322895 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.322707 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-web-config\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 
18:50:34.322895 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.322760 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.322895 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.322784 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.322895 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.322799 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.322895 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.322818 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.322895 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.322864 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.423875 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.423820 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.423875 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.423874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.424156 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.423895 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.424156 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.423913 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-web-config\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.424156 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.423940 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.424156 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.423967 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.424156 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.423993 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.424156 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.424022 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.424156 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.424069 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.424994 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.424828 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.424994 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.424921 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.424994 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.424923 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.424994 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.424976 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.425460 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.425021 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-config\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.425460 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.425064 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-config-out\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.425460 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.425279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.425460 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.425315 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.425460 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.425348 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.425460 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.425380 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.425460 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.425411 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wk9wd\" (UniqueName: \"kubernetes.io/projected/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-kube-api-access-wk9wd\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.425847 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.425472 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.425847 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.425583 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.425847 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.425810 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.427058 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.427026 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.427171 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.427125 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-web-config\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.427171 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.427159 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.427872 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.427851 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.428244 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.428220 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.428389 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.428370 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.428959 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.428935 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.429182 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.429164 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-config-out\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.429724 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.429694 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.429790 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.429728 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.429953 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.429935 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.430117 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.430101 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-config\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.435553 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.435529 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk9wd\" (UniqueName: \"kubernetes.io/projected/c94c8fa3-6c4d-4510-a858-7f4981f6f54b-kube-api-access-wk9wd\") pod \"prometheus-k8s-0\" (UID: \"c94c8fa3-6c4d-4510-a858-7f4981f6f54b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.574584 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.574530 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:50:34.705718 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:34.705694 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:50:34.707876 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:50:34.707842 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc94c8fa3_6c4d_4510_a858_7f4981f6f54b.slice/crio-fc8be02d9490311fc93cc6d786714eb141beae1ac4cf7b66537158b8a477b182 WatchSource:0}: Error finding container fc8be02d9490311fc93cc6d786714eb141beae1ac4cf7b66537158b8a477b182: Status 404 returned error can't find the container with id fc8be02d9490311fc93cc6d786714eb141beae1ac4cf7b66537158b8a477b182 Apr 22 18:50:35.203952 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:35.203915 2575 generic.go:358] "Generic (PLEG): container finished" podID="c94c8fa3-6c4d-4510-a858-7f4981f6f54b" containerID="407f104855725a1c0159265c8a724a24a5668380c27e85fa8e481c2c365974ef" exitCode=0 Apr 22 18:50:35.204143 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:35.204007 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c94c8fa3-6c4d-4510-a858-7f4981f6f54b","Type":"ContainerDied","Data":"407f104855725a1c0159265c8a724a24a5668380c27e85fa8e481c2c365974ef"} Apr 22 18:50:35.204143 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:35.204046 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c94c8fa3-6c4d-4510-a858-7f4981f6f54b","Type":"ContainerStarted","Data":"fc8be02d9490311fc93cc6d786714eb141beae1ac4cf7b66537158b8a477b182"} Apr 22 18:50:35.469086 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:35.469057 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02" path="/var/lib/kubelet/pods/a73f90a8-4fcc-4bf4-84c0-c52d6ebcee02/volumes" Apr 22 18:50:36.210592 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:36.210557 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c94c8fa3-6c4d-4510-a858-7f4981f6f54b","Type":"ContainerStarted","Data":"3c1646f383026569e9109c1b6d9e958ac540cce08992872b228251b643f99c48"} Apr 22 18:50:36.210592 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:36.210598 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c94c8fa3-6c4d-4510-a858-7f4981f6f54b","Type":"ContainerStarted","Data":"2da9ef115f9bb5e1024f7c6a61ea7c51de294aff0600f5619861277b16dbced0"} Apr 22 18:50:36.210987 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:36.210612 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c94c8fa3-6c4d-4510-a858-7f4981f6f54b","Type":"ContainerStarted","Data":"f64395bcbb251992caf60f7909d7e6b47065a0a11cdda7d70ab318ccecb71b30"} Apr 22 18:50:36.210987 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:36.210625 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c94c8fa3-6c4d-4510-a858-7f4981f6f54b","Type":"ContainerStarted","Data":"a20a253e14237cba32ae32f4a14189f38a38fe8a32db79c7831d50dbedd093f6"} Apr 22 18:50:36.210987 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:36.210637 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"c94c8fa3-6c4d-4510-a858-7f4981f6f54b","Type":"ContainerStarted","Data":"69eb7c367d57f43d62ab59626ab4b2f88c750e4761c60fa4f3b6f9166d0ff6b9"} Apr 22 18:50:36.210987 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:36.210648 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c94c8fa3-6c4d-4510-a858-7f4981f6f54b","Type":"ContainerStarted","Data":"a9b94067f4129bc7abce4d7094d37b5850a91e1fc246cdffd0ad6a464d784f27"} Apr 22 18:50:36.237511 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:36.237458 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.237441708 podStartE2EDuration="2.237441708s" podCreationTimestamp="2026-04-22 18:50:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:50:36.235449844 +0000 UTC m=+233.297934176" watchObservedRunningTime="2026-04-22 18:50:36.237441708 +0000 UTC m=+233.299926039" Apr 22 18:50:39.575198 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:50:39.575160 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:34.574989 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:51:34.574934 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:34.590764 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:51:34.590736 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:35.401247 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:51:35.401195 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:51:43.387281 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:51:43.387247 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 18:51:43.387809 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:51:43.387487 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 18:51:43.394078 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:51:43.394053 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:53:25.321706 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:53:25.321667 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-k7vfs"] Apr 22 18:53:25.325375 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:53:25.325350 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-k7vfs" Apr 22 18:53:25.328062 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:53:25.328040 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-rgx9p\"" Apr 22 18:53:25.328062 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:53:25.328059 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 18:53:25.328255 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:53:25.328090 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 18:53:25.329127 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:53:25.329111 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 18:53:25.333349 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:53:25.333327 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-k7vfs"] Apr 22 18:53:25.472676 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:53:25.472642 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wqxb\" (UniqueName: \"kubernetes.io/projected/7e3b6b69-4654-4a1c-af31-d0c26f95319a-kube-api-access-2wqxb\") pod \"seaweedfs-86cc847c5c-k7vfs\" (UID: \"7e3b6b69-4654-4a1c-af31-d0c26f95319a\") " pod="kserve/seaweedfs-86cc847c5c-k7vfs" Apr 22 18:53:25.472848 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:53:25.472693 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7e3b6b69-4654-4a1c-af31-d0c26f95319a-data\") pod \"seaweedfs-86cc847c5c-k7vfs\" (UID: \"7e3b6b69-4654-4a1c-af31-d0c26f95319a\") " pod="kserve/seaweedfs-86cc847c5c-k7vfs" Apr 22 18:53:25.573463 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:53:25.573376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wqxb\" (UniqueName: \"kubernetes.io/projected/7e3b6b69-4654-4a1c-af31-d0c26f95319a-kube-api-access-2wqxb\") pod \"seaweedfs-86cc847c5c-k7vfs\" (UID: \"7e3b6b69-4654-4a1c-af31-d0c26f95319a\") " pod="kserve/seaweedfs-86cc847c5c-k7vfs" Apr 22 18:53:25.573463 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:53:25.573421 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7e3b6b69-4654-4a1c-af31-d0c26f95319a-data\") pod \"seaweedfs-86cc847c5c-k7vfs\" (UID: \"7e3b6b69-4654-4a1c-af31-d0c26f95319a\") " pod="kserve/seaweedfs-86cc847c5c-k7vfs" Apr 22 18:53:25.573806 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:53:25.573787 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/7e3b6b69-4654-4a1c-af31-d0c26f95319a-data\") pod \"seaweedfs-86cc847c5c-k7vfs\" (UID: \"7e3b6b69-4654-4a1c-af31-d0c26f95319a\") " pod="kserve/seaweedfs-86cc847c5c-k7vfs" Apr 22 18:53:25.581678 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:53:25.581648 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wqxb\" (UniqueName: \"kubernetes.io/projected/7e3b6b69-4654-4a1c-af31-d0c26f95319a-kube-api-access-2wqxb\") pod \"seaweedfs-86cc847c5c-k7vfs\" (UID: \"7e3b6b69-4654-4a1c-af31-d0c26f95319a\") " pod="kserve/seaweedfs-86cc847c5c-k7vfs" Apr 22 18:53:25.634699 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:53:25.634668 2575 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-k7vfs" Apr 22 18:53:25.755874 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:53:25.755839 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-k7vfs"] Apr 22 18:53:25.758758 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:53:25.758720 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e3b6b69_4654_4a1c_af31_d0c26f95319a.slice/crio-243737b09644f735255b9358bd8d54d3c54b7189e4333fc9e3ef6f6d701eeb29 WatchSource:0}: Error finding container 243737b09644f735255b9358bd8d54d3c54b7189e4333fc9e3ef6f6d701eeb29: Status 404 returned error can't find the container with id 243737b09644f735255b9358bd8d54d3c54b7189e4333fc9e3ef6f6d701eeb29 Apr 22 18:53:25.759902 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:53:25.759881 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:53:26.710426 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:53:26.710380 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-k7vfs" event={"ID":"7e3b6b69-4654-4a1c-af31-d0c26f95319a","Type":"ContainerStarted","Data":"243737b09644f735255b9358bd8d54d3c54b7189e4333fc9e3ef6f6d701eeb29"} Apr 22 18:53:28.718038 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:53:28.717999 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-k7vfs" event={"ID":"7e3b6b69-4654-4a1c-af31-d0c26f95319a","Type":"ContainerStarted","Data":"6989191d55f3fb1ee172d91be7eec041411693bdc85747d3756f580b69246e32"} Apr 22 18:53:28.718480 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:53:28.718146 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-k7vfs" Apr 22 18:53:28.733888 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:53:28.733840 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-k7vfs" podStartSLOduration=1.227159198 podStartE2EDuration="3.733824187s" podCreationTimestamp="2026-04-22 18:53:25 +0000 UTC" firstStartedPulling="2026-04-22 18:53:25.760000411 +0000 UTC m=+402.822484720" lastFinishedPulling="2026-04-22 18:53:28.266665396 +0000 UTC m=+405.329149709" observedRunningTime="2026-04-22 18:53:28.732621895 +0000 UTC m=+405.795106225" watchObservedRunningTime="2026-04-22 18:53:28.733824187 +0000 UTC m=+405.796308522" Apr 22 18:53:34.723301 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:53:34.723263 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-k7vfs" Apr 22 18:54:00.743354 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:00.743316 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-gsgbt"] Apr 22 18:54:00.748035 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:00.748012 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-gsgbt" Apr 22 18:54:00.751703 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:00.751676 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-jm5vc\"" Apr 22 18:54:00.751845 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:00.751682 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 22 18:54:00.756340 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:00.756312 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-gsgbt"] Apr 22 18:54:00.863233 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:00.863155 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2vvb\" (UniqueName: \"kubernetes.io/projected/9e48487b-20ef-406b-8bd0-d644eb493e21-kube-api-access-m2vvb\") pod \"kserve-controller-manager-6f655776dd-gsgbt\" (UID: \"9e48487b-20ef-406b-8bd0-d644eb493e21\") " pod="kserve/kserve-controller-manager-6f655776dd-gsgbt" Apr 22 18:54:00.863414 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:00.863249 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e48487b-20ef-406b-8bd0-d644eb493e21-cert\") pod \"kserve-controller-manager-6f655776dd-gsgbt\" (UID: \"9e48487b-20ef-406b-8bd0-d644eb493e21\") " pod="kserve/kserve-controller-manager-6f655776dd-gsgbt" Apr 22 18:54:00.964566 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:00.964532 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2vvb\" (UniqueName: \"kubernetes.io/projected/9e48487b-20ef-406b-8bd0-d644eb493e21-kube-api-access-m2vvb\") pod \"kserve-controller-manager-6f655776dd-gsgbt\" (UID: \"9e48487b-20ef-406b-8bd0-d644eb493e21\") " pod="kserve/kserve-controller-manager-6f655776dd-gsgbt" Apr 22 18:54:00.964566 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:00.964574 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e48487b-20ef-406b-8bd0-d644eb493e21-cert\") pod \"kserve-controller-manager-6f655776dd-gsgbt\" (UID: \"9e48487b-20ef-406b-8bd0-d644eb493e21\") " pod="kserve/kserve-controller-manager-6f655776dd-gsgbt" Apr 22 18:54:00.967138 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:00.967112 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e48487b-20ef-406b-8bd0-d644eb493e21-cert\") pod \"kserve-controller-manager-6f655776dd-gsgbt\" (UID: \"9e48487b-20ef-406b-8bd0-d644eb493e21\") " pod="kserve/kserve-controller-manager-6f655776dd-gsgbt" Apr 22 18:54:00.972746 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:00.972715 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2vvb\" (UniqueName: \"kubernetes.io/projected/9e48487b-20ef-406b-8bd0-d644eb493e21-kube-api-access-m2vvb\") pod \"kserve-controller-manager-6f655776dd-gsgbt\" (UID: \"9e48487b-20ef-406b-8bd0-d644eb493e21\") " pod="kserve/kserve-controller-manager-6f655776dd-gsgbt" Apr 22 18:54:01.059278 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:01.059233 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6f655776dd-gsgbt" Apr 22 18:54:01.187192 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:01.187165 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6f655776dd-gsgbt"] Apr 22 18:54:01.189588 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:54:01.189558 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e48487b_20ef_406b_8bd0_d644eb493e21.slice/crio-c32b180acaa7b3cc228d816eebe0c950801e243c4247b049c4d5a6163dd3694f WatchSource:0}: Error finding container c32b180acaa7b3cc228d816eebe0c950801e243c4247b049c4d5a6163dd3694f: Status 404 returned error can't find the container with id c32b180acaa7b3cc228d816eebe0c950801e243c4247b049c4d5a6163dd3694f Apr 22 18:54:01.810798 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:01.810703 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-gsgbt" event={"ID":"9e48487b-20ef-406b-8bd0-d644eb493e21","Type":"ContainerStarted","Data":"c32b180acaa7b3cc228d816eebe0c950801e243c4247b049c4d5a6163dd3694f"} Apr 22 18:54:03.818409 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:03.818366 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6f655776dd-gsgbt" event={"ID":"9e48487b-20ef-406b-8bd0-d644eb493e21","Type":"ContainerStarted","Data":"c67a10d3ca828d30e69560ec1ddcb8579eab6799f69e6b2a21fbef185bcd9dba"} Apr 22 18:54:03.818840 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:03.818503 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6f655776dd-gsgbt" Apr 22 18:54:03.839861 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:03.839751 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6f655776dd-gsgbt" podStartSLOduration=1.442652882 podStartE2EDuration="3.839736814s" podCreationTimestamp="2026-04-22 18:54:00 +0000 UTC" firstStartedPulling="2026-04-22 18:54:01.191196873 +0000 UTC m=+438.253681181" lastFinishedPulling="2026-04-22 18:54:03.588280804 +0000 UTC m=+440.650765113" observedRunningTime="2026-04-22 18:54:03.83860168 +0000 UTC m=+440.901086012" watchObservedRunningTime="2026-04-22 18:54:03.839736814 +0000 UTC m=+440.902221200" Apr 22 18:54:34.827062 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:34.827028 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6f655776dd-gsgbt" Apr 22 18:54:53.339504 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:53.339470 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-qxpn4"] Apr 22 18:54:53.342788 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:53.342770 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-qxpn4" Apr 22 18:54:53.355029 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:53.354997 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-qxpn4"] Apr 22 18:54:53.412318 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:53.412277 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hwxx\" (UniqueName: \"kubernetes.io/projected/3de716eb-c772-414f-9e47-2f4ea744e76f-kube-api-access-6hwxx\") pod \"s3-init-qxpn4\" (UID: \"3de716eb-c772-414f-9e47-2f4ea744e76f\") " pod="kserve/s3-init-qxpn4" Apr 22 18:54:53.512818 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:53.512787 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hwxx\" (UniqueName: \"kubernetes.io/projected/3de716eb-c772-414f-9e47-2f4ea744e76f-kube-api-access-6hwxx\") pod \"s3-init-qxpn4\" (UID: \"3de716eb-c772-414f-9e47-2f4ea744e76f\") " pod="kserve/s3-init-qxpn4" Apr 22 18:54:53.521598 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:53.521567 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hwxx\" (UniqueName: \"kubernetes.io/projected/3de716eb-c772-414f-9e47-2f4ea744e76f-kube-api-access-6hwxx\") pod \"s3-init-qxpn4\" (UID: \"3de716eb-c772-414f-9e47-2f4ea744e76f\") " pod="kserve/s3-init-qxpn4" Apr 22 18:54:53.666032 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:53.665942 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-qxpn4" Apr 22 18:54:53.786949 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:53.786924 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-qxpn4"] Apr 22 18:54:53.789470 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:54:53.789431 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3de716eb_c772_414f_9e47_2f4ea744e76f.slice/crio-9d6356f121784c5da873958daa9005347a8185d90346ad947255a1117e002f71 WatchSource:0}: Error finding container 9d6356f121784c5da873958daa9005347a8185d90346ad947255a1117e002f71: Status 404 returned error can't find the container with id 9d6356f121784c5da873958daa9005347a8185d90346ad947255a1117e002f71 Apr 22 18:54:53.972153 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:53.972050 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-qxpn4" event={"ID":"3de716eb-c772-414f-9e47-2f4ea744e76f","Type":"ContainerStarted","Data":"9d6356f121784c5da873958daa9005347a8185d90346ad947255a1117e002f71"} Apr 22 18:54:58.990654 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:58.990618 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-qxpn4" event={"ID":"3de716eb-c772-414f-9e47-2f4ea744e76f","Type":"ContainerStarted","Data":"a91be72451b67012153815e1e20e7734462a04f5d0063fba349f430f68995689"} Apr 22 18:54:59.005761 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:54:59.005711 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-qxpn4" podStartSLOduration=1.5659095870000002 podStartE2EDuration="6.005697318s" podCreationTimestamp="2026-04-22 18:54:53 +0000 UTC" firstStartedPulling="2026-04-22 18:54:53.791177272 +0000 UTC m=+490.853661585" lastFinishedPulling="2026-04-22 18:54:58.230965007 +0000 UTC m=+495.293449316" observedRunningTime="2026-04-22 18:54:59.004606634 +0000 UTC m=+496.067090966" watchObservedRunningTime="2026-04-22 18:54:59.005697318 +0000 UTC 
m=+496.068181648" Apr 22 18:55:02.004864 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:02.004830 2575 generic.go:358] "Generic (PLEG): container finished" podID="3de716eb-c772-414f-9e47-2f4ea744e76f" containerID="a91be72451b67012153815e1e20e7734462a04f5d0063fba349f430f68995689" exitCode=0 Apr 22 18:55:02.005294 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:02.004882 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-qxpn4" event={"ID":"3de716eb-c772-414f-9e47-2f4ea744e76f","Type":"ContainerDied","Data":"a91be72451b67012153815e1e20e7734462a04f5d0063fba349f430f68995689"} Apr 22 18:55:03.135270 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:03.135245 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-qxpn4" Apr 22 18:55:03.303849 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:03.303811 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hwxx\" (UniqueName: \"kubernetes.io/projected/3de716eb-c772-414f-9e47-2f4ea744e76f-kube-api-access-6hwxx\") pod \"3de716eb-c772-414f-9e47-2f4ea744e76f\" (UID: \"3de716eb-c772-414f-9e47-2f4ea744e76f\") " Apr 22 18:55:03.306002 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:03.305973 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3de716eb-c772-414f-9e47-2f4ea744e76f-kube-api-access-6hwxx" (OuterVolumeSpecName: "kube-api-access-6hwxx") pod "3de716eb-c772-414f-9e47-2f4ea744e76f" (UID: "3de716eb-c772-414f-9e47-2f4ea744e76f"). InnerVolumeSpecName "kube-api-access-6hwxx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:55:03.404912 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:03.404872 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6hwxx\" (UniqueName: \"kubernetes.io/projected/3de716eb-c772-414f-9e47-2f4ea744e76f-kube-api-access-6hwxx\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:55:04.012445 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:04.012409 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-qxpn4" event={"ID":"3de716eb-c772-414f-9e47-2f4ea744e76f","Type":"ContainerDied","Data":"9d6356f121784c5da873958daa9005347a8185d90346ad947255a1117e002f71"} Apr 22 18:55:04.012445 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:04.012444 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d6356f121784c5da873958daa9005347a8185d90346ad947255a1117e002f71" Apr 22 18:55:04.012445 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:04.012418 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-qxpn4" Apr 22 18:55:13.876918 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:13.876879 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk"] Apr 22 18:55:13.877346 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:13.877198 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3de716eb-c772-414f-9e47-2f4ea744e76f" containerName="s3-init" Apr 22 18:55:13.877346 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:13.877228 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3de716eb-c772-414f-9e47-2f4ea744e76f" containerName="s3-init" Apr 22 18:55:13.877346 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:13.877294 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3de716eb-c772-414f-9e47-2f4ea744e76f" containerName="s3-init" Apr 22 18:55:13.880533 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:13.880516 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" Apr 22 18:55:13.884543 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:13.884509 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 18:55:13.884543 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:13.884548 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 18:55:13.884543 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:13.884510 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-predictor-serving-cert\"" Apr 22 18:55:13.884865 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:13.884510 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-n6c6z\"" Apr 22 18:55:13.884865 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:13.884510 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-graph-kube-rbac-proxy-sar-config\"" Apr 22 18:55:13.890877 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:13.890852 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk"] Apr 22 18:55:13.998243 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:13.998201 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/678591aa-8424-43f6-83e4-57d29c7eba30-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-msxjk\" (UID: \"678591aa-8424-43f6-83e4-57d29c7eba30\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" Apr 22 18:55:13.998430 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:13.998278 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbdv5\" (UniqueName: \"kubernetes.io/projected/678591aa-8424-43f6-83e4-57d29c7eba30-kube-api-access-pbdv5\") pod \"isvc-xgboost-graph-predictor-669d8d6456-msxjk\" (UID: \"678591aa-8424-43f6-83e4-57d29c7eba30\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" Apr 22 18:55:13.998430 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:13.998308 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/678591aa-8424-43f6-83e4-57d29c7eba30-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-msxjk\" (UID: \"678591aa-8424-43f6-83e4-57d29c7eba30\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" Apr 22 18:55:13.998430 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:13.998344 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/678591aa-8424-43f6-83e4-57d29c7eba30-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-msxjk\" (UID: \"678591aa-8424-43f6-83e4-57d29c7eba30\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" Apr 22 18:55:14.099815 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:14.099764 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/678591aa-8424-43f6-83e4-57d29c7eba30-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-graph-predictor-669d8d6456-msxjk\" (UID: \"678591aa-8424-43f6-83e4-57d29c7eba30\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" Apr 22 18:55:14.100001 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:14.099845 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbdv5\" (UniqueName: \"kubernetes.io/projected/678591aa-8424-43f6-83e4-57d29c7eba30-kube-api-access-pbdv5\") pod \"isvc-xgboost-graph-predictor-669d8d6456-msxjk\" (UID: \"678591aa-8424-43f6-83e4-57d29c7eba30\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" Apr 22 18:55:14.100001 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:14.099884 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/678591aa-8424-43f6-83e4-57d29c7eba30-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-msxjk\" (UID: \"678591aa-8424-43f6-83e4-57d29c7eba30\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" Apr 22 18:55:14.100001 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:14.099935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/678591aa-8424-43f6-83e4-57d29c7eba30-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-msxjk\" (UID: \"678591aa-8424-43f6-83e4-57d29c7eba30\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" Apr 22 18:55:14.100329 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:14.100308 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/678591aa-8424-43f6-83e4-57d29c7eba30-kserve-provision-location\") pod \"isvc-xgboost-graph-predictor-669d8d6456-msxjk\" (UID: \"678591aa-8424-43f6-83e4-57d29c7eba30\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" Apr 22 18:55:14.100532 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:14.100511 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/678591aa-8424-43f6-83e4-57d29c7eba30-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod 
\"isvc-xgboost-graph-predictor-669d8d6456-msxjk\" (UID: \"678591aa-8424-43f6-83e4-57d29c7eba30\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" Apr 22 18:55:14.102599 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:14.102576 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/678591aa-8424-43f6-83e4-57d29c7eba30-proxy-tls\") pod \"isvc-xgboost-graph-predictor-669d8d6456-msxjk\" (UID: \"678591aa-8424-43f6-83e4-57d29c7eba30\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" Apr 22 18:55:14.108113 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:14.108087 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbdv5\" (UniqueName: \"kubernetes.io/projected/678591aa-8424-43f6-83e4-57d29c7eba30-kube-api-access-pbdv5\") pod \"isvc-xgboost-graph-predictor-669d8d6456-msxjk\" (UID: \"678591aa-8424-43f6-83e4-57d29c7eba30\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" Apr 22 18:55:14.191837 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:14.191738 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" Apr 22 18:55:14.321938 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:14.321910 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk"] Apr 22 18:55:14.324231 ip-10-0-134-109 kubenswrapper[2575]: W0422 18:55:14.324182 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod678591aa_8424_43f6_83e4_57d29c7eba30.slice/crio-546dd405f33541b5f49ac15aacd649323845280228cfd0c2e14ed19b3a440e70 WatchSource:0}: Error finding container 546dd405f33541b5f49ac15aacd649323845280228cfd0c2e14ed19b3a440e70: Status 404 returned error can't find the container with id 546dd405f33541b5f49ac15aacd649323845280228cfd0c2e14ed19b3a440e70 Apr 22 18:55:15.048521 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:15.048488 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" event={"ID":"678591aa-8424-43f6-83e4-57d29c7eba30","Type":"ContainerStarted","Data":"546dd405f33541b5f49ac15aacd649323845280228cfd0c2e14ed19b3a440e70"} Apr 22 18:55:19.062846 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:19.062808 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" event={"ID":"678591aa-8424-43f6-83e4-57d29c7eba30","Type":"ContainerStarted","Data":"34109ddaccd2e202b2addadc1567ef2be50e89a550f613685ee82781f30f580d"} Apr 22 18:55:23.076074 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:23.076043 2575 generic.go:358] "Generic (PLEG): container finished" podID="678591aa-8424-43f6-83e4-57d29c7eba30" containerID="34109ddaccd2e202b2addadc1567ef2be50e89a550f613685ee82781f30f580d" exitCode=0 Apr 22 18:55:23.076484 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:23.076109 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" event={"ID":"678591aa-8424-43f6-83e4-57d29c7eba30","Type":"ContainerDied","Data":"34109ddaccd2e202b2addadc1567ef2be50e89a550f613685ee82781f30f580d"} Apr 22 18:55:44.151590 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:44.151535 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" event={"ID":"678591aa-8424-43f6-83e4-57d29c7eba30","Type":"ContainerStarted","Data":"c8bddb242f5bad7bfd78059729fcffaead28c87dcd2f0607d261fede3918a589"} Apr 22 18:55:46.159873 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:46.159829 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" event={"ID":"678591aa-8424-43f6-83e4-57d29c7eba30","Type":"ContainerStarted","Data":"60760f1838132c395000bb75fdee8e28c20d2dac2ee207bb0f4a4ee74637c5e9"} Apr 22 18:55:46.160382 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:46.159987 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" Apr 22 18:55:46.179308 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:46.179258 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" podStartSLOduration=1.991659303 podStartE2EDuration="33.179244246s" podCreationTimestamp="2026-04-22 18:55:13 +0000 UTC" firstStartedPulling="2026-04-22 18:55:14.326044317 +0000 UTC m=+511.388528627" lastFinishedPulling="2026-04-22 18:55:45.513629261 +0000 UTC m=+542.576113570" observedRunningTime="2026-04-22 18:55:46.177119147 +0000 UTC m=+543.239603478" watchObservedRunningTime="2026-04-22 18:55:46.179244246 +0000 UTC m=+543.241728576" Apr 22 18:55:47.163285 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:47.163250 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" Apr 22 18:55:47.164470 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:47.164438 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" podUID="678591aa-8424-43f6-83e4-57d29c7eba30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 22 18:55:48.165804 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:48.165763 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" podUID="678591aa-8424-43f6-83e4-57d29c7eba30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 22 18:55:53.170101 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:53.170072 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" Apr 22 18:55:53.170620 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:55:53.170593 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" podUID="678591aa-8424-43f6-83e4-57d29c7eba30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 22 18:56:03.170676 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:56:03.170633 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" podUID="678591aa-8424-43f6-83e4-57d29c7eba30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 22 18:56:13.170628 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:56:13.170590 2575 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" podUID="678591aa-8424-43f6-83e4-57d29c7eba30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 22 18:56:23.171086 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:56:23.171041 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" podUID="678591aa-8424-43f6-83e4-57d29c7eba30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 22 18:56:33.171118 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:56:33.171076 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" podUID="678591aa-8424-43f6-83e4-57d29c7eba30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 22 18:56:43.171419 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:56:43.171371 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" podUID="678591aa-8424-43f6-83e4-57d29c7eba30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused" Apr 22 18:56:43.410340 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:56:43.410306 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 18:56:43.411255 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:56:43.411228 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 18:56:53.171684 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:56:53.171655 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" Apr 22 18:57:23.565350 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:23.565316 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk"] Apr 22 18:57:23.566807 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:23.565802 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" podUID="678591aa-8424-43f6-83e4-57d29c7eba30" containerName="kserve-container" containerID="cri-o://c8bddb242f5bad7bfd78059729fcffaead28c87dcd2f0607d261fede3918a589" gracePeriod=30 Apr 22 18:57:23.566807 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:23.565897 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" podUID="678591aa-8424-43f6-83e4-57d29c7eba30" containerName="kube-rbac-proxy" containerID="cri-o://60760f1838132c395000bb75fdee8e28c20d2dac2ee207bb0f4a4ee74637c5e9" gracePeriod=30 Apr 22 18:57:24.458760 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:24.458724 2575 generic.go:358] "Generic (PLEG): container finished" podID="678591aa-8424-43f6-83e4-57d29c7eba30" containerID="60760f1838132c395000bb75fdee8e28c20d2dac2ee207bb0f4a4ee74637c5e9" exitCode=2 Apr 22 18:57:24.458933 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:24.458799 2575 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" event={"ID":"678591aa-8424-43f6-83e4-57d29c7eba30","Type":"ContainerDied","Data":"60760f1838132c395000bb75fdee8e28c20d2dac2ee207bb0f4a4ee74637c5e9"} Apr 22 18:57:27.400343 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.400316 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" Apr 22 18:57:27.415099 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.415073 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/678591aa-8424-43f6-83e4-57d29c7eba30-kserve-provision-location\") pod \"678591aa-8424-43f6-83e4-57d29c7eba30\" (UID: \"678591aa-8424-43f6-83e4-57d29c7eba30\") " Apr 22 18:57:27.415276 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.415119 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/678591aa-8424-43f6-83e4-57d29c7eba30-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") pod \"678591aa-8424-43f6-83e4-57d29c7eba30\" (UID: \"678591aa-8424-43f6-83e4-57d29c7eba30\") " Apr 22 18:57:27.415276 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.415164 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/678591aa-8424-43f6-83e4-57d29c7eba30-proxy-tls\") pod \"678591aa-8424-43f6-83e4-57d29c7eba30\" (UID: \"678591aa-8424-43f6-83e4-57d29c7eba30\") " Apr 22 18:57:27.415276 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.415184 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbdv5\" (UniqueName: \"kubernetes.io/projected/678591aa-8424-43f6-83e4-57d29c7eba30-kube-api-access-pbdv5\") pod \"678591aa-8424-43f6-83e4-57d29c7eba30\" (UID: \"678591aa-8424-43f6-83e4-57d29c7eba30\") " Apr 22 18:57:27.415505 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.415478 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/678591aa-8424-43f6-83e4-57d29c7eba30-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "678591aa-8424-43f6-83e4-57d29c7eba30" (UID: "678591aa-8424-43f6-83e4-57d29c7eba30"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:57:27.415564 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.415521 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/678591aa-8424-43f6-83e4-57d29c7eba30-isvc-xgboost-graph-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-graph-kube-rbac-proxy-sar-config") pod "678591aa-8424-43f6-83e4-57d29c7eba30" (UID: "678591aa-8424-43f6-83e4-57d29c7eba30"). InnerVolumeSpecName "isvc-xgboost-graph-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:57:27.417466 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.417444 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/678591aa-8424-43f6-83e4-57d29c7eba30-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "678591aa-8424-43f6-83e4-57d29c7eba30" (UID: "678591aa-8424-43f6-83e4-57d29c7eba30"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:57:27.417621 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.417588 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/678591aa-8424-43f6-83e4-57d29c7eba30-kube-api-access-pbdv5" (OuterVolumeSpecName: "kube-api-access-pbdv5") pod "678591aa-8424-43f6-83e4-57d29c7eba30" (UID: "678591aa-8424-43f6-83e4-57d29c7eba30"). InnerVolumeSpecName "kube-api-access-pbdv5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:57:27.469743 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.469658 2575 generic.go:358] "Generic (PLEG): container finished" podID="678591aa-8424-43f6-83e4-57d29c7eba30" containerID="c8bddb242f5bad7bfd78059729fcffaead28c87dcd2f0607d261fede3918a589" exitCode=0 Apr 22 18:57:27.469888 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.469822 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" Apr 22 18:57:27.470395 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.470372 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" event={"ID":"678591aa-8424-43f6-83e4-57d29c7eba30","Type":"ContainerDied","Data":"c8bddb242f5bad7bfd78059729fcffaead28c87dcd2f0607d261fede3918a589"} Apr 22 18:57:27.470500 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.470404 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk" event={"ID":"678591aa-8424-43f6-83e4-57d29c7eba30","Type":"ContainerDied","Data":"546dd405f33541b5f49ac15aacd649323845280228cfd0c2e14ed19b3a440e70"} Apr 22 18:57:27.470500 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.470423 2575 scope.go:117] "RemoveContainer" containerID="60760f1838132c395000bb75fdee8e28c20d2dac2ee207bb0f4a4ee74637c5e9" Apr 22 18:57:27.479437 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.479223 2575 scope.go:117] "RemoveContainer" containerID="c8bddb242f5bad7bfd78059729fcffaead28c87dcd2f0607d261fede3918a589" Apr 22 18:57:27.486907 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.486887 2575 scope.go:117] "RemoveContainer" containerID="34109ddaccd2e202b2addadc1567ef2be50e89a550f613685ee82781f30f580d" Apr 22 18:57:27.493744 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.493722 2575 scope.go:117] "RemoveContainer" containerID="60760f1838132c395000bb75fdee8e28c20d2dac2ee207bb0f4a4ee74637c5e9" Apr 22 18:57:27.494040 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:57:27.494020 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60760f1838132c395000bb75fdee8e28c20d2dac2ee207bb0f4a4ee74637c5e9\": container with ID starting with 60760f1838132c395000bb75fdee8e28c20d2dac2ee207bb0f4a4ee74637c5e9 not found: ID does not exist" containerID="60760f1838132c395000bb75fdee8e28c20d2dac2ee207bb0f4a4ee74637c5e9" Apr 22 18:57:27.494131 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.494048 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60760f1838132c395000bb75fdee8e28c20d2dac2ee207bb0f4a4ee74637c5e9"} err="failed to get container status \"60760f1838132c395000bb75fdee8e28c20d2dac2ee207bb0f4a4ee74637c5e9\": rpc error: code = NotFound desc = could not find container \"60760f1838132c395000bb75fdee8e28c20d2dac2ee207bb0f4a4ee74637c5e9\": container with ID starting with 
60760f1838132c395000bb75fdee8e28c20d2dac2ee207bb0f4a4ee74637c5e9 not found: ID does not exist" Apr 22 18:57:27.494131 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.494086 2575 scope.go:117] "RemoveContainer" containerID="c8bddb242f5bad7bfd78059729fcffaead28c87dcd2f0607d261fede3918a589" Apr 22 18:57:27.494455 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:57:27.494413 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8bddb242f5bad7bfd78059729fcffaead28c87dcd2f0607d261fede3918a589\": container with ID starting with c8bddb242f5bad7bfd78059729fcffaead28c87dcd2f0607d261fede3918a589 not found: ID does not exist" containerID="c8bddb242f5bad7bfd78059729fcffaead28c87dcd2f0607d261fede3918a589" Apr 22 18:57:27.494597 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.494465 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8bddb242f5bad7bfd78059729fcffaead28c87dcd2f0607d261fede3918a589"} err="failed to get container status \"c8bddb242f5bad7bfd78059729fcffaead28c87dcd2f0607d261fede3918a589\": rpc error: code = NotFound desc = could not find container \"c8bddb242f5bad7bfd78059729fcffaead28c87dcd2f0607d261fede3918a589\": container with ID starting with c8bddb242f5bad7bfd78059729fcffaead28c87dcd2f0607d261fede3918a589 not found: ID does not exist" Apr 22 18:57:27.494597 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.494485 2575 scope.go:117] "RemoveContainer" containerID="34109ddaccd2e202b2addadc1567ef2be50e89a550f613685ee82781f30f580d" Apr 22 18:57:27.494953 ip-10-0-134-109 kubenswrapper[2575]: E0422 18:57:27.494812 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34109ddaccd2e202b2addadc1567ef2be50e89a550f613685ee82781f30f580d\": container with ID starting with 34109ddaccd2e202b2addadc1567ef2be50e89a550f613685ee82781f30f580d not found: ID does not exist" containerID="34109ddaccd2e202b2addadc1567ef2be50e89a550f613685ee82781f30f580d" Apr 22 18:57:27.494953 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.494843 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34109ddaccd2e202b2addadc1567ef2be50e89a550f613685ee82781f30f580d"} err="failed to get container status \"34109ddaccd2e202b2addadc1567ef2be50e89a550f613685ee82781f30f580d\": rpc error: code = NotFound desc = could not find container \"34109ddaccd2e202b2addadc1567ef2be50e89a550f613685ee82781f30f580d\": container with ID starting with 34109ddaccd2e202b2addadc1567ef2be50e89a550f613685ee82781f30f580d not found: ID does not exist" Apr 22 18:57:27.496247 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.496226 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk"] Apr 22 18:57:27.499104 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.499082 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-predictor-669d8d6456-msxjk"] Apr 22 18:57:27.516629 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.516607 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pbdv5\" (UniqueName: \"kubernetes.io/projected/678591aa-8424-43f6-83e4-57d29c7eba30-kube-api-access-pbdv5\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:57:27.516629 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.516630 2575 reconciler_common.go:299] "Volume 
detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/678591aa-8424-43f6-83e4-57d29c7eba30-kserve-provision-location\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:57:27.516754 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.516642 2575 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-graph-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/678591aa-8424-43f6-83e4-57d29c7eba30-isvc-xgboost-graph-kube-rbac-proxy-sar-config\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:57:27.516754 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:27.516652 2575 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/678591aa-8424-43f6-83e4-57d29c7eba30-proxy-tls\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 18:57:29.469410 ip-10-0-134-109 kubenswrapper[2575]: I0422 18:57:29.469370 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="678591aa-8424-43f6-83e4-57d29c7eba30" path="/var/lib/kubelet/pods/678591aa-8424-43f6-83e4-57d29c7eba30/volumes" Apr 22 19:01:43.435300 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:01:43.435197 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 19:01:43.437276 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:01:43.437250 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 19:06:43.457767 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:06:43.457731 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 19:06:43.461354 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:06:43.461329 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 19:11:43.480036 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:11:43.480009 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 19:11:43.484545 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:11:43.484527 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 19:16:43.501808 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:16:43.501773 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 19:16:43.507604 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:16:43.507580 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 19:21:43.523863 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:21:43.523828 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 19:21:43.530904 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:21:43.530883 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 19:26:43.546319 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:26:43.546180 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 19:26:43.554247 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:26:43.554219 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 19:31:43.567816 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:31:43.567679 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 19:31:43.577308 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:31:43.577284 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log" Apr 22 19:34:34.090187 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.090108 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-m6qpv/must-gather-l4ng7"] Apr 22 19:34:34.090679 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.090481 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="678591aa-8424-43f6-83e4-57d29c7eba30" containerName="storage-initializer" Apr 22 19:34:34.090679 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.090493 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="678591aa-8424-43f6-83e4-57d29c7eba30" containerName="storage-initializer" Apr 22 19:34:34.090679 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.090504 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="678591aa-8424-43f6-83e4-57d29c7eba30" containerName="kube-rbac-proxy" Apr 22 19:34:34.090679 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.090510 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="678591aa-8424-43f6-83e4-57d29c7eba30" containerName="kube-rbac-proxy" Apr 22 19:34:34.090679 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.090519 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="678591aa-8424-43f6-83e4-57d29c7eba30" containerName="kserve-container" Apr 22 19:34:34.090679 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.090525 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="678591aa-8424-43f6-83e4-57d29c7eba30" containerName="kserve-container" Apr 22 19:34:34.090679 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.090597 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="678591aa-8424-43f6-83e4-57d29c7eba30" containerName="kube-rbac-proxy" Apr 22 19:34:34.090679 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.090608 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="678591aa-8424-43f6-83e4-57d29c7eba30" containerName="kserve-container" Apr 22 19:34:34.093696 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.093680 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m6qpv/must-gather-l4ng7" Apr 22 19:34:34.096254 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.096227 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-m6qpv\"/\"openshift-service-ca.crt\"" Apr 22 19:34:34.096254 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.096243 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-m6qpv\"/\"kube-root-ca.crt\"" Apr 22 19:34:34.097395 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.097379 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-m6qpv\"/\"default-dockercfg-zx9pc\"" Apr 22 19:34:34.106151 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.106120 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-m6qpv/must-gather-l4ng7"] Apr 22 19:34:34.180796 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.180751 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d6h8\" (UniqueName: \"kubernetes.io/projected/db68297f-00b7-4b1d-9df0-3912d92a62ff-kube-api-access-8d6h8\") pod \"must-gather-l4ng7\" (UID: \"db68297f-00b7-4b1d-9df0-3912d92a62ff\") " pod="openshift-must-gather-m6qpv/must-gather-l4ng7" Apr 22 19:34:34.180796 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.180798 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db68297f-00b7-4b1d-9df0-3912d92a62ff-must-gather-output\") pod \"must-gather-l4ng7\" (UID: \"db68297f-00b7-4b1d-9df0-3912d92a62ff\") " pod="openshift-must-gather-m6qpv/must-gather-l4ng7" Apr 22 19:34:34.281589 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.281550 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8d6h8\" (UniqueName: \"kubernetes.io/projected/db68297f-00b7-4b1d-9df0-3912d92a62ff-kube-api-access-8d6h8\") pod \"must-gather-l4ng7\" (UID: \"db68297f-00b7-4b1d-9df0-3912d92a62ff\") " pod="openshift-must-gather-m6qpv/must-gather-l4ng7" Apr 22 19:34:34.281589 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.281589 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db68297f-00b7-4b1d-9df0-3912d92a62ff-must-gather-output\") pod \"must-gather-l4ng7\" (UID: \"db68297f-00b7-4b1d-9df0-3912d92a62ff\") " pod="openshift-must-gather-m6qpv/must-gather-l4ng7" Apr 22 19:34:34.281905 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.281889 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db68297f-00b7-4b1d-9df0-3912d92a62ff-must-gather-output\") pod \"must-gather-l4ng7\" (UID: \"db68297f-00b7-4b1d-9df0-3912d92a62ff\") " pod="openshift-must-gather-m6qpv/must-gather-l4ng7" Apr 22 19:34:34.290771 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.290738 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d6h8\" (UniqueName: \"kubernetes.io/projected/db68297f-00b7-4b1d-9df0-3912d92a62ff-kube-api-access-8d6h8\") pod \"must-gather-l4ng7\" (UID: \"db68297f-00b7-4b1d-9df0-3912d92a62ff\") " pod="openshift-must-gather-m6qpv/must-gather-l4ng7" Apr 22 19:34:34.416275 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.416164 2575 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-m6qpv/must-gather-l4ng7" Apr 22 19:34:34.539461 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.539436 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-m6qpv/must-gather-l4ng7"] Apr 22 19:34:34.541959 ip-10-0-134-109 kubenswrapper[2575]: W0422 19:34:34.541931 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb68297f_00b7_4b1d_9df0_3912d92a62ff.slice/crio-d62f6d6be8b032be6ce4109636fba036c61ddf7b6e547b9c5a69269c4ba04f87 WatchSource:0}: Error finding container d62f6d6be8b032be6ce4109636fba036c61ddf7b6e547b9c5a69269c4ba04f87: Status 404 returned error can't find the container with id d62f6d6be8b032be6ce4109636fba036c61ddf7b6e547b9c5a69269c4ba04f87 Apr 22 19:34:34.543716 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:34.543698 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:34:35.075163 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:35.075124 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m6qpv/must-gather-l4ng7" event={"ID":"db68297f-00b7-4b1d-9df0-3912d92a62ff","Type":"ContainerStarted","Data":"d62f6d6be8b032be6ce4109636fba036c61ddf7b6e547b9c5a69269c4ba04f87"} Apr 22 19:34:40.093656 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:40.093561 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m6qpv/must-gather-l4ng7" event={"ID":"db68297f-00b7-4b1d-9df0-3912d92a62ff","Type":"ContainerStarted","Data":"74a7097b2b589667d1a987b15817e3edfe51bbbd9aaced79c2225b16dd320409"} Apr 22 19:34:40.093656 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:40.093606 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m6qpv/must-gather-l4ng7" event={"ID":"db68297f-00b7-4b1d-9df0-3912d92a62ff","Type":"ContainerStarted","Data":"de1694d3b5908c3fe712ccd8dedd7c799d492f17305506745dba625ebd2f712e"} Apr 22 19:34:40.109972 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:40.109919 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-m6qpv/must-gather-l4ng7" podStartSLOduration=0.856863677 podStartE2EDuration="6.109905232s" podCreationTimestamp="2026-04-22 19:34:34 +0000 UTC" firstStartedPulling="2026-04-22 19:34:34.543852884 +0000 UTC m=+2871.606337194" lastFinishedPulling="2026-04-22 19:34:39.796894435 +0000 UTC m=+2876.859378749" observedRunningTime="2026-04-22 19:34:40.108759614 +0000 UTC m=+2877.171243946" watchObservedRunningTime="2026-04-22 19:34:40.109905232 +0000 UTC m=+2877.172389563" Apr 22 19:34:59.154396 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:59.154358 2575 generic.go:358] "Generic (PLEG): container finished" podID="db68297f-00b7-4b1d-9df0-3912d92a62ff" containerID="de1694d3b5908c3fe712ccd8dedd7c799d492f17305506745dba625ebd2f712e" exitCode=0 Apr 22 19:34:59.154834 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:59.154430 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m6qpv/must-gather-l4ng7" event={"ID":"db68297f-00b7-4b1d-9df0-3912d92a62ff","Type":"ContainerDied","Data":"de1694d3b5908c3fe712ccd8dedd7c799d492f17305506745dba625ebd2f712e"} Apr 22 19:34:59.154834 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:59.154796 2575 scope.go:117] "RemoveContainer" containerID="de1694d3b5908c3fe712ccd8dedd7c799d492f17305506745dba625ebd2f712e" Apr 22 19:34:59.198986 
ip-10-0-134-109 kubenswrapper[2575]: I0422 19:34:59.198956 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m6qpv_must-gather-l4ng7_db68297f-00b7-4b1d-9df0-3912d92a62ff/gather/0.log" Apr 22 19:35:02.532569 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:02.532539 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-bh2v7_d1699627-d338-4301-b677-d1b46965f511/global-pull-secret-syncer/0.log" Apr 22 19:35:02.700875 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:02.700840 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vgg98_7d7e2f0a-fdb6-465e-81f4-dd71e4ceb2e1/konnectivity-agent/0.log" Apr 22 19:35:02.760223 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:02.760164 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-109.ec2.internal_b12aca16166beec2da98d7da868da1f9/haproxy/0.log" Apr 22 19:35:04.617001 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:04.616957 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-m6qpv/must-gather-l4ng7"] Apr 22 19:35:04.617528 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:04.617272 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-m6qpv/must-gather-l4ng7" podUID="db68297f-00b7-4b1d-9df0-3912d92a62ff" containerName="copy" containerID="cri-o://74a7097b2b589667d1a987b15817e3edfe51bbbd9aaced79c2225b16dd320409" gracePeriod=2 Apr 22 19:35:04.619798 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:04.619761 2575 status_manager.go:895] "Failed to get status for pod" podUID="db68297f-00b7-4b1d-9df0-3912d92a62ff" pod="openshift-must-gather-m6qpv/must-gather-l4ng7" err="pods \"must-gather-l4ng7\" is forbidden: User \"system:node:ip-10-0-134-109.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-m6qpv\": no relationship found between node 'ip-10-0-134-109.ec2.internal' and this object" Apr 22 19:35:04.621359 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:04.621225 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-m6qpv/must-gather-l4ng7"] Apr 22 19:35:04.841461 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:04.841433 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m6qpv_must-gather-l4ng7_db68297f-00b7-4b1d-9df0-3912d92a62ff/copy/0.log" Apr 22 19:35:04.841801 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:04.841787 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m6qpv/must-gather-l4ng7" Apr 22 19:35:04.843885 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:04.843853 2575 status_manager.go:895] "Failed to get status for pod" podUID="db68297f-00b7-4b1d-9df0-3912d92a62ff" pod="openshift-must-gather-m6qpv/must-gather-l4ng7" err="pods \"must-gather-l4ng7\" is forbidden: User \"system:node:ip-10-0-134-109.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-m6qpv\": no relationship found between node 'ip-10-0-134-109.ec2.internal' and this object" Apr 22 19:35:04.951325 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:04.951242 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db68297f-00b7-4b1d-9df0-3912d92a62ff-must-gather-output\") pod \"db68297f-00b7-4b1d-9df0-3912d92a62ff\" (UID: \"db68297f-00b7-4b1d-9df0-3912d92a62ff\") " Apr 22 19:35:04.951325 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:04.951307 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d6h8\" (UniqueName: \"kubernetes.io/projected/db68297f-00b7-4b1d-9df0-3912d92a62ff-kube-api-access-8d6h8\") pod \"db68297f-00b7-4b1d-9df0-3912d92a62ff\" (UID: \"db68297f-00b7-4b1d-9df0-3912d92a62ff\") " Apr 22 19:35:04.952725 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:04.952693 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db68297f-00b7-4b1d-9df0-3912d92a62ff-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "db68297f-00b7-4b1d-9df0-3912d92a62ff" (UID: "db68297f-00b7-4b1d-9df0-3912d92a62ff"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:35:04.953487 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:04.953459 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db68297f-00b7-4b1d-9df0-3912d92a62ff-kube-api-access-8d6h8" (OuterVolumeSpecName: "kube-api-access-8d6h8") pod "db68297f-00b7-4b1d-9df0-3912d92a62ff" (UID: "db68297f-00b7-4b1d-9df0-3912d92a62ff"). InnerVolumeSpecName "kube-api-access-8d6h8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:35:05.052742 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:05.052703 2575 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db68297f-00b7-4b1d-9df0-3912d92a62ff-must-gather-output\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 19:35:05.052742 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:05.052735 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8d6h8\" (UniqueName: \"kubernetes.io/projected/db68297f-00b7-4b1d-9df0-3912d92a62ff-kube-api-access-8d6h8\") on node \"ip-10-0-134-109.ec2.internal\" DevicePath \"\"" Apr 22 19:35:05.175072 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:05.175042 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m6qpv_must-gather-l4ng7_db68297f-00b7-4b1d-9df0-3912d92a62ff/copy/0.log" Apr 22 19:35:05.175415 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:05.175394 2575 generic.go:358] "Generic (PLEG): container finished" podID="db68297f-00b7-4b1d-9df0-3912d92a62ff" containerID="74a7097b2b589667d1a987b15817e3edfe51bbbd9aaced79c2225b16dd320409" exitCode=143 Apr 22 19:35:05.175474 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:05.175452 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m6qpv/must-gather-l4ng7" Apr 22 19:35:05.175511 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:05.175495 2575 scope.go:117] "RemoveContainer" containerID="74a7097b2b589667d1a987b15817e3edfe51bbbd9aaced79c2225b16dd320409" Apr 22 19:35:05.177726 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:05.177699 2575 status_manager.go:895] "Failed to get status for pod" podUID="db68297f-00b7-4b1d-9df0-3912d92a62ff" pod="openshift-must-gather-m6qpv/must-gather-l4ng7" err="pods \"must-gather-l4ng7\" is forbidden: User \"system:node:ip-10-0-134-109.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-m6qpv\": no relationship found between node 'ip-10-0-134-109.ec2.internal' and this object" Apr 22 19:35:05.183678 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:05.183658 2575 scope.go:117] "RemoveContainer" containerID="de1694d3b5908c3fe712ccd8dedd7c799d492f17305506745dba625ebd2f712e" Apr 22 19:35:05.186084 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:05.186060 2575 status_manager.go:895] "Failed to get status for pod" podUID="db68297f-00b7-4b1d-9df0-3912d92a62ff" pod="openshift-must-gather-m6qpv/must-gather-l4ng7" err="pods \"must-gather-l4ng7\" is forbidden: User \"system:node:ip-10-0-134-109.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-m6qpv\": no relationship found between node 'ip-10-0-134-109.ec2.internal' and this object" Apr 22 19:35:05.195870 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:05.195850 2575 scope.go:117] "RemoveContainer" containerID="74a7097b2b589667d1a987b15817e3edfe51bbbd9aaced79c2225b16dd320409" Apr 22 19:35:05.196165 ip-10-0-134-109 kubenswrapper[2575]: E0422 19:35:05.196138 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74a7097b2b589667d1a987b15817e3edfe51bbbd9aaced79c2225b16dd320409\": container with ID starting with 74a7097b2b589667d1a987b15817e3edfe51bbbd9aaced79c2225b16dd320409 not found: ID does not exist" containerID="74a7097b2b589667d1a987b15817e3edfe51bbbd9aaced79c2225b16dd320409" Apr 22 
19:35:05.196280 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:05.196169 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74a7097b2b589667d1a987b15817e3edfe51bbbd9aaced79c2225b16dd320409"} err="failed to get container status \"74a7097b2b589667d1a987b15817e3edfe51bbbd9aaced79c2225b16dd320409\": rpc error: code = NotFound desc = could not find container \"74a7097b2b589667d1a987b15817e3edfe51bbbd9aaced79c2225b16dd320409\": container with ID starting with 74a7097b2b589667d1a987b15817e3edfe51bbbd9aaced79c2225b16dd320409 not found: ID does not exist" Apr 22 19:35:05.196280 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:05.196187 2575 scope.go:117] "RemoveContainer" containerID="de1694d3b5908c3fe712ccd8dedd7c799d492f17305506745dba625ebd2f712e" Apr 22 19:35:05.196447 ip-10-0-134-109 kubenswrapper[2575]: E0422 19:35:05.196424 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de1694d3b5908c3fe712ccd8dedd7c799d492f17305506745dba625ebd2f712e\": container with ID starting with de1694d3b5908c3fe712ccd8dedd7c799d492f17305506745dba625ebd2f712e not found: ID does not exist" containerID="de1694d3b5908c3fe712ccd8dedd7c799d492f17305506745dba625ebd2f712e" Apr 22 19:35:05.196486 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:05.196453 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de1694d3b5908c3fe712ccd8dedd7c799d492f17305506745dba625ebd2f712e"} err="failed to get container status \"de1694d3b5908c3fe712ccd8dedd7c799d492f17305506745dba625ebd2f712e\": rpc error: code = NotFound desc = could not find container \"de1694d3b5908c3fe712ccd8dedd7c799d492f17305506745dba625ebd2f712e\": container with ID starting with de1694d3b5908c3fe712ccd8dedd7c799d492f17305506745dba625ebd2f712e not found: ID does not exist" Apr 22 19:35:05.469149 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:05.469113 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db68297f-00b7-4b1d-9df0-3912d92a62ff" path="/var/lib/kubelet/pods/db68297f-00b7-4b1d-9df0-3912d92a62ff/volumes" Apr 22 19:35:06.064835 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.064802 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-jcxvp_c7699e9a-8c09-4e2b-80d0-2164298c3efb/monitoring-plugin/0.log" Apr 22 19:35:06.103194 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.103161 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6ct76_b7cef634-5e2e-4422-9c18-541b2a2e35a4/node-exporter/0.log" Apr 22 19:35:06.127381 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.127350 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6ct76_b7cef634-5e2e-4422-9c18-541b2a2e35a4/kube-rbac-proxy/0.log" Apr 22 19:35:06.150727 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.150696 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6ct76_b7cef634-5e2e-4422-9c18-541b2a2e35a4/init-textfile/0.log" Apr 22 19:35:06.317383 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.317317 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-c74zj_78a4a419-e37d-403e-996d-669a974f998b/kube-rbac-proxy-main/0.log" Apr 22 19:35:06.339282 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.339248 2575 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-c74zj_78a4a419-e37d-403e-996d-669a974f998b/kube-rbac-proxy-self/0.log" Apr 22 19:35:06.361096 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.361046 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-c74zj_78a4a419-e37d-403e-996d-669a974f998b/openshift-state-metrics/0.log" Apr 22 19:35:06.410873 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.410842 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c94c8fa3-6c4d-4510-a858-7f4981f6f54b/prometheus/0.log" Apr 22 19:35:06.428433 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.428403 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c94c8fa3-6c4d-4510-a858-7f4981f6f54b/config-reloader/0.log" Apr 22 19:35:06.451059 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.451034 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c94c8fa3-6c4d-4510-a858-7f4981f6f54b/thanos-sidecar/0.log" Apr 22 19:35:06.472969 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.472934 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c94c8fa3-6c4d-4510-a858-7f4981f6f54b/kube-rbac-proxy-web/0.log" Apr 22 19:35:06.496371 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.496343 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c94c8fa3-6c4d-4510-a858-7f4981f6f54b/kube-rbac-proxy/0.log" Apr 22 19:35:06.518572 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.518542 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c94c8fa3-6c4d-4510-a858-7f4981f6f54b/kube-rbac-proxy-thanos/0.log" Apr 22 19:35:06.540084 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.540052 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c94c8fa3-6c4d-4510-a858-7f4981f6f54b/init-config-reloader/0.log" Apr 22 19:35:06.570434 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.570350 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-tc87t_3b630301-6ad2-44fd-bf82-2128fb0cbf7f/prometheus-operator/0.log" Apr 22 19:35:06.587135 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.587106 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-tc87t_3b630301-6ad2-44fd-bf82-2128fb0cbf7f/kube-rbac-proxy/0.log" Apr 22 19:35:06.611077 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.611051 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-g7dg7_42404b27-22f2-4667-b5b0-82e7fbc1840e/prometheus-operator-admission-webhook/0.log" Apr 22 19:35:06.709935 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.709904 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-545fb8c4d5-cbqvr_41cbca3c-525e-4447-aa48-6fec0572815b/thanos-query/0.log" Apr 22 19:35:06.733500 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.733467 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-545fb8c4d5-cbqvr_41cbca3c-525e-4447-aa48-6fec0572815b/kube-rbac-proxy-web/0.log" Apr 22 19:35:06.754943 ip-10-0-134-109 
kubenswrapper[2575]: I0422 19:35:06.754912 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-545fb8c4d5-cbqvr_41cbca3c-525e-4447-aa48-6fec0572815b/kube-rbac-proxy/0.log" Apr 22 19:35:06.776715 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.776683 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-545fb8c4d5-cbqvr_41cbca3c-525e-4447-aa48-6fec0572815b/prom-label-proxy/0.log" Apr 22 19:35:06.799274 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.799249 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-545fb8c4d5-cbqvr_41cbca3c-525e-4447-aa48-6fec0572815b/kube-rbac-proxy-rules/0.log" Apr 22 19:35:06.821728 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:06.821647 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-545fb8c4d5-cbqvr_41cbca3c-525e-4447-aa48-6fec0572815b/kube-rbac-proxy-metrics/0.log" Apr 22 19:35:07.991733 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:07.991648 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-c7qdg_7b6d1ad8-a365-4cbe-a6c2-78e6855f83c2/networking-console-plugin/0.log" Apr 22 19:35:09.862751 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:09.862715 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5xxjt_a708e3d6-d406-4cfd-ab5f-8dd221a9fd88/dns/0.log" Apr 22 19:35:09.885295 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:09.885267 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5xxjt_a708e3d6-d406-4cfd-ab5f-8dd221a9fd88/kube-rbac-proxy/0.log" Apr 22 19:35:09.905460 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:09.905430 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps"] Apr 22 19:35:09.905757 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:09.905746 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db68297f-00b7-4b1d-9df0-3912d92a62ff" containerName="gather" Apr 22 19:35:09.905814 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:09.905759 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="db68297f-00b7-4b1d-9df0-3912d92a62ff" containerName="gather" Apr 22 19:35:09.905814 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:09.905770 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db68297f-00b7-4b1d-9df0-3912d92a62ff" containerName="copy" Apr 22 19:35:09.905814 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:09.905775 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="db68297f-00b7-4b1d-9df0-3912d92a62ff" containerName="copy" Apr 22 19:35:09.905902 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:09.905838 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="db68297f-00b7-4b1d-9df0-3912d92a62ff" containerName="gather" Apr 22 19:35:09.905902 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:09.905848 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="db68297f-00b7-4b1d-9df0-3912d92a62ff" containerName="copy" Apr 22 19:35:09.911122 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:09.911100 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps" Apr 22 19:35:09.913763 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:09.913736 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6dp56\"/\"kube-root-ca.crt\"" Apr 22 19:35:09.914803 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:09.914785 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6dp56\"/\"default-dockercfg-dcnhf\"" Apr 22 19:35:09.914872 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:09.914785 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6dp56\"/\"openshift-service-ca.crt\"" Apr 22 19:35:09.917669 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:09.917649 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps"] Apr 22 19:35:09.997874 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:09.997838 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3684597b-21a6-4a70-8209-755443479962-lib-modules\") pod \"perf-node-gather-daemonset-j6rps\" (UID: \"3684597b-21a6-4a70-8209-755443479962\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps" Apr 22 19:35:09.998107 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:09.997884 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3684597b-21a6-4a70-8209-755443479962-podres\") pod \"perf-node-gather-daemonset-j6rps\" (UID: \"3684597b-21a6-4a70-8209-755443479962\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps" Apr 22 19:35:09.998107 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:09.998009 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnx96\" (UniqueName: \"kubernetes.io/projected/3684597b-21a6-4a70-8209-755443479962-kube-api-access-hnx96\") pod \"perf-node-gather-daemonset-j6rps\" (UID: \"3684597b-21a6-4a70-8209-755443479962\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps" Apr 22 19:35:09.998107 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:09.998061 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3684597b-21a6-4a70-8209-755443479962-proc\") pod \"perf-node-gather-daemonset-j6rps\" (UID: \"3684597b-21a6-4a70-8209-755443479962\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps" Apr 22 19:35:09.998107 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:09.998090 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3684597b-21a6-4a70-8209-755443479962-sys\") pod \"perf-node-gather-daemonset-j6rps\" (UID: \"3684597b-21a6-4a70-8209-755443479962\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps" Apr 22 19:35:10.033743 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:10.033707 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kfn2f_af1a0441-5fdc-4aa3-ac3d-26b54d430c2b/dns-node-resolver/0.log" Apr 22 19:35:10.099118 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:10.099080 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hnx96\" (UniqueName: \"kubernetes.io/projected/3684597b-21a6-4a70-8209-755443479962-kube-api-access-hnx96\") pod \"perf-node-gather-daemonset-j6rps\" (UID: \"3684597b-21a6-4a70-8209-755443479962\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps" Apr 22 19:35:10.099118 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:10.099128 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3684597b-21a6-4a70-8209-755443479962-proc\") pod \"perf-node-gather-daemonset-j6rps\" (UID: \"3684597b-21a6-4a70-8209-755443479962\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps" Apr 22 19:35:10.099392 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:10.099148 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3684597b-21a6-4a70-8209-755443479962-sys\") pod \"perf-node-gather-daemonset-j6rps\" (UID: \"3684597b-21a6-4a70-8209-755443479962\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps" Apr 22 19:35:10.099392 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:10.099183 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3684597b-21a6-4a70-8209-755443479962-lib-modules\") pod \"perf-node-gather-daemonset-j6rps\" (UID: \"3684597b-21a6-4a70-8209-755443479962\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps" Apr 22 19:35:10.099392 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:10.099247 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3684597b-21a6-4a70-8209-755443479962-proc\") pod \"perf-node-gather-daemonset-j6rps\" (UID: \"3684597b-21a6-4a70-8209-755443479962\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps" Apr 22 19:35:10.099392 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:10.099253 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3684597b-21a6-4a70-8209-755443479962-podres\") pod \"perf-node-gather-daemonset-j6rps\" (UID: \"3684597b-21a6-4a70-8209-755443479962\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps" Apr 22 19:35:10.099392 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:10.099297 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3684597b-21a6-4a70-8209-755443479962-sys\") pod \"perf-node-gather-daemonset-j6rps\" (UID: \"3684597b-21a6-4a70-8209-755443479962\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps" Apr 22 19:35:10.099598 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:10.099402 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3684597b-21a6-4a70-8209-755443479962-lib-modules\") pod \"perf-node-gather-daemonset-j6rps\" (UID: \"3684597b-21a6-4a70-8209-755443479962\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps" Apr 22 19:35:10.099598 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:10.099409 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3684597b-21a6-4a70-8209-755443479962-podres\") pod \"perf-node-gather-daemonset-j6rps\" (UID: 
\"3684597b-21a6-4a70-8209-755443479962\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps" Apr 22 19:35:10.107396 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:10.107370 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnx96\" (UniqueName: \"kubernetes.io/projected/3684597b-21a6-4a70-8209-755443479962-kube-api-access-hnx96\") pod \"perf-node-gather-daemonset-j6rps\" (UID: \"3684597b-21a6-4a70-8209-755443479962\") " pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps" Apr 22 19:35:10.222244 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:10.222123 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps" Apr 22 19:35:10.346053 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:10.346024 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps"] Apr 22 19:35:10.348991 ip-10-0-134-109 kubenswrapper[2575]: W0422 19:35:10.348958 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3684597b_21a6_4a70_8209_755443479962.slice/crio-5ee2008ee77ee701d85fb66ca6753899b5a56b073f16c5245c6e54f9dc07e38b WatchSource:0}: Error finding container 5ee2008ee77ee701d85fb66ca6753899b5a56b073f16c5245c6e54f9dc07e38b: Status 404 returned error can't find the container with id 5ee2008ee77ee701d85fb66ca6753899b5a56b073f16c5245c6e54f9dc07e38b Apr 22 19:35:10.514086 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:10.513993 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rqzlh_ea1be31b-74a3-48cb-b181-97ff279b206a/node-ca/0.log" Apr 22 19:35:11.195459 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:11.195421 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps" event={"ID":"3684597b-21a6-4a70-8209-755443479962","Type":"ContainerStarted","Data":"64cbf616d55a4cacd07fd00dded8d18884b4419c9c4764c140b3fb016ac0aaa9"} Apr 22 19:35:11.195459 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:11.195462 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps" event={"ID":"3684597b-21a6-4a70-8209-755443479962","Type":"ContainerStarted","Data":"5ee2008ee77ee701d85fb66ca6753899b5a56b073f16c5245c6e54f9dc07e38b"} Apr 22 19:35:11.195958 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:11.195561 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps" Apr 22 19:35:11.211512 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:11.211457 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps" podStartSLOduration=2.211440537 podStartE2EDuration="2.211440537s" podCreationTimestamp="2026-04-22 19:35:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:35:11.210732426 +0000 UTC m=+2908.273216758" watchObservedRunningTime="2026-04-22 19:35:11.211440537 +0000 UTC m=+2908.273924894" Apr 22 19:35:11.542091 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:11.542061 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-lv9p4_a39d7311-24fb-45df-884c-194983c0905a/serve-healthcheck-canary/0.log" 
Apr 22 19:35:11.886989 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:11.886890 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-wprxw_c13de4f5-3c84-460a-9d6a-f2c326e0eef9/insights-operator/0.log"
Apr 22 19:35:11.887991 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:11.887969 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-wprxw_c13de4f5-3c84-460a-9d6a-f2c326e0eef9/insights-operator/1.log"
Apr 22 19:35:12.026970 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:12.026936 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rdfsn_b6b7df6c-540d-4973-8f17-dd5152793680/kube-rbac-proxy/0.log"
Apr 22 19:35:12.048911 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:12.048877 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rdfsn_b6b7df6c-540d-4973-8f17-dd5152793680/exporter/0.log"
Apr 22 19:35:12.070481 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:12.070448 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rdfsn_b6b7df6c-540d-4973-8f17-dd5152793680/extractor/0.log"
Apr 22 19:35:13.950218 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:13.950170 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-6f655776dd-gsgbt_9e48487b-20ef-406b-8bd0-d644eb493e21/manager/0.log"
Apr 22 19:35:14.246551 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:14.246475 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-qxpn4_3de716eb-c772-414f-9e47-2f4ea744e76f/s3-init/0.log"
Apr 22 19:35:14.273275 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:14.273239 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-k7vfs_7e3b6b69-4654-4a1c-af31-d0c26f95319a/seaweedfs/0.log"
Apr 22 19:35:17.208888 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:17.208861 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6dp56/perf-node-gather-daemonset-j6rps"
Apr 22 19:35:17.629960 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:17.629930 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-l97zl_97cb2698-cc23-43ca-8160-d8f09f48ade2/migrator/0.log"
Apr 22 19:35:17.648989 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:17.648959 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-l97zl_97cb2698-cc23-43ca-8160-d8f09f48ade2/graceful-termination/0.log"
Apr 22 19:35:19.298994 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:19.298960 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vdhgw_46f110eb-3658-4771-b650-29f48ae2842a/kube-multus-additional-cni-plugins/0.log"
Apr 22 19:35:19.319675 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:19.319644 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vdhgw_46f110eb-3658-4771-b650-29f48ae2842a/egress-router-binary-copy/0.log"
Apr 22 19:35:19.339800 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:19.339767 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vdhgw_46f110eb-3658-4771-b650-29f48ae2842a/cni-plugins/0.log"
Apr 22 19:35:19.361793 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:19.361764 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vdhgw_46f110eb-3658-4771-b650-29f48ae2842a/bond-cni-plugin/0.log"
Apr 22 19:35:19.386287 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:19.386251 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vdhgw_46f110eb-3658-4771-b650-29f48ae2842a/routeoverride-cni/0.log"
Apr 22 19:35:19.417535 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:19.417505 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vdhgw_46f110eb-3658-4771-b650-29f48ae2842a/whereabouts-cni-bincopy/0.log"
Apr 22 19:35:19.442354 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:19.442261 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vdhgw_46f110eb-3658-4771-b650-29f48ae2842a/whereabouts-cni/0.log"
Apr 22 19:35:19.500547 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:19.500512 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rlzcp_2db298c2-8f21-4074-b8fa-de93cd62c24a/kube-multus/0.log"
Apr 22 19:35:19.615130 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:19.615104 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zn7pv_16f54576-6941-4246-bcb0-89cfeef13253/network-metrics-daemon/0.log"
Apr 22 19:35:19.637113 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:19.637082 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zn7pv_16f54576-6941-4246-bcb0-89cfeef13253/kube-rbac-proxy/0.log"
Apr 22 19:35:20.584055 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:20.584009 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-controller/0.log"
Apr 22 19:35:20.600469 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:20.600443 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/0.log"
Apr 22 19:35:20.625973 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:20.625939 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovn-acl-logging/1.log"
Apr 22 19:35:20.646630 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:20.646602 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/kube-rbac-proxy-node/0.log"
Apr 22 19:35:20.667766 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:20.667734 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 19:35:20.684106 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:20.684077 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/northd/0.log"
Apr 22 19:35:20.704418 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:20.704392 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/nbdb/0.log"
Apr 22 19:35:20.726636 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:20.726600 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/sbdb/0.log"
Apr 22 19:35:20.884434 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:20.884343 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p7bgj_e473ff71-80f3-4edf-8096-6fa108acae8a/ovnkube-controller/0.log"
Apr 22 19:35:22.213350 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:22.213320 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-84n9f_f5a171a3-924e-421e-a715-95ec17243358/network-check-target-container/0.log"
Apr 22 19:35:23.150879 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:23.150851 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-rqxnk_ec7921c7-6ea3-40af-a74d-07b404ff9eb9/iptables-alerter/0.log"
Apr 22 19:35:23.752355 ip-10-0-134-109 kubenswrapper[2575]: I0422 19:35:23.752324 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-4w69n_1a7ef120-7570-4923-8c42-9141c31054c0/tuned/0.log"